5 Alternative Perspectives on Quality in Student Financial Aid Programs

In the preceding chapter, the panel carefully defined the types of error that are encountered in the student financial aid system and then discussed the quality control and measurement procedures that are currently in use. In this chapter we consider quality from the perspective of three participants with major responsibilities for the control of error in the system: the Department of Education, the student applicant, and the postsecondary institution. We begin with a discussion of the attempts by the Department of Education to gauge the "state of error" in the system through sample surveys, particularly the most recent major study, the Integrated Quality Control Measurement Project. After a discussion of the shortcomings of that study and of problems with the large-scale survey approach to error measurement in general, we turn to an examination of possible systemic causes of error by considering the complexity of the process from the viewpoint of the individual applicant. In the final section of the chapter, we consider some of the problems in maintaining quality that are faced by the postsecondary institution, the bearer of legal and moral responsibility for the just administration of financial aid. Not surprisingly, the perspectives of the three players differ in many respects. Each viewpoint, however, constitutes an important dimension of the "total picture" of quality in the student financial aid system.

SURVEYS OF ERROR

Thus far, the sole attempts to measure errors throughout the student financial aid system have been through a sequence of special surveys of aid
recipients commissioned by the Department of Education. The panel first discusses these studies and then addresses the issue of nonrecipients. The early studies (see Table 5-1) focused on errors in Pell grants. More recent studies included estimates of error in Campus-Based and Guaranteed Student Loan awards as the survey designs were developed to select cases in those programs.

TABLE 5-1 Previous Quality Control Studies Commissioned by the Department of Education

1975: Office of Education study. Compared IRS records with data from applicants. Repeated in 1977 and 1980.

1977: Report of Student Financial Assistance Group. Sources of data were public testimony, previous studies, and audits.

1979-80: Basic Educational Opportunity Grant study. Examined application data and institutional records.

1980-83: Pell Grant Quality Control Study. Studied error in the Pell Grant Program. The study, which consisted of two large national surveys, compared delivery systems, assessed options for redesigning delivery systems, and developed the Institutional Quality Control Handbook. The study concluded that error in the Pell program was significant and that, aside from the need for program and policy restructuring, quality could be improved through implementation of verification and validation efforts.

1983-86: Stage One of the Title IV Quality Control Study. A national survey of recipients of Campus-Based aid and certified applicants of Guaranteed Student Loans (GSLs). This study examined error in the Campus-Based and GSL programs.

1987: Stage Two of the Title IV Quality Control Study. A national survey of recipients of Pell grants, Campus-Based aid, and certified applicants of GSLs. This study examined error in the Title IV programs. It also included a comparison with other federal programs and an assessment of potential incentives and disincentives.

1985-92: The Institutional Quality Control Pilot Project. Assisted educational institutions in designing their own quality improvement programs and demonstrated the applicability of quality control procedures for financial aid at the institutional level.

1986: Guaranteed Student Loan Quality Control Project. Identified and measured error in the GSL program due to financial institutions, guaranty agencies, and the Department of Education.

1988-90: The Integrated Quality Control Measurement Project. Examined the extent of error in the delivery of Title IV aid, patterns and trends in error that point to systemic problems in the delivery system, possible causes of error and steps that can be taken to enhance quality in the future, and the extent to which prior actions have improved quality.

SOURCE: Based in part on Chauvin et al. (1989).

Our focus in this discussion is on two of the department's
commissioned reports, Title IV Quality Control Project: Stage Two, Final Report, Vol. 2, Corrective Actions (Advanced Technology and Westat, 1987b) and the Integrated Quality Control Measurement Project report (Price Waterhouse, 1990b).
TABLE 5-3 Components of Student and Institutional Error in the Pell Grant Program ($ millions)

Student error

1982-83 (a):
- Dependency status: 64
- Other nontaxable income: 46
- Household size: 34
- Number in postsecondary schools: 24
- Home equity: 18
- Assets of dependent students: 17
- Adjusted gross income of parents and independent students: 16
- Income of dependent students: 12
- Taxes paid: 2

1985-86 (b):
- Other nontaxable income: 75.1
- Home equity: 64.0
- Dependency status: 45.4
- Dependent students' net assets: 35.5
- Students' expected income: 32.6
- Household size: 29.9
- Adjusted gross income: 20.6
- Number in college: 18.4

1988-89 (c):
- Parents' number in college: 72
- Parents' household size: 70
- Parents' home value: 47
- Students' 1987 adjusted gross income: 44
- Parents' real estate/investment value: 32
- Students' household size: 31
- Students' 1987 other nontaxable income and benefits: 24
- Parents' 1987 adjusted gross income: 24

Institutional error

1982-83 (d):
- Missing financial aid transcript: 95
- Incorrect determination of enrollment status: -39
- Incorrect calculation or disbursement of award: 24
- Incorrect determination of cost of attendance: -21

1985-86 (e):
- Missing financial aid transcript: 41.2
- Missing Selective Service compliance statement: 30.5
- Missing statement of educational purpose: 28.1
- Award to students with bachelor's degree: 13.6
- Incorrect determination of enrollment status: 9.6
- Incorrect determination of cost of attendance: 8.3
- Loan default: 4.7
- Incorrect calculation or disbursement of award: 3.7

1988-89 (f):
- Missing statement of educational purpose: 29
- Missing Selective Service compliance statement: 29
- Award to students with bachelor's degree: 29
- Missing financial aid transcript: 14
- Incorrect determination of cost of attendance: 9
- Independent under unusual circumstances: 1
- Default/repayment: (value not legible in source)
- Half-time enrollment: not observed
- Factoring other aid: NA

a. General Accounting Office (1985:Table 20, p. 57).
b. Advanced Technology and Westat (1987a:Exhibit 3-2, p. 3-5).
c. These numbers are rough estimates based on values found on the graph (Price Waterhouse, 1990b:Exhibit III-1, p. 14).
d. Negative values are net underawards. General Accounting Office (1985:Table 22, p. 60).
e. Advanced Technology and Westat (1987a:Exhibit 3-3, p. 3-8).
f. These numbers are rough estimates based on values found on the graph (Price Waterhouse, 1990b:Exhibit III-2, p. 16).

These error estimates are based on sample data and thus are themselves subject to the sources of error associated with surveys. These problems will be addressed later in this chapter.

Assuming that errors are not made deliberately, several types of "student error" could not be corrected even with additional care on the part of the applicant. For example, some problems arise because, to apply for aid from some states and institutions, the application must be completed before the applicant completes his or her federal tax form. Thus, dependency status, taxes, and adjusted gross income are estimates and prone to error. Similarly, household size and the number in postsecondary schools are projections for the following academic year and are subject to change as circumstances change.
In order to simplify the application procedure and include fewer items that are subject to error, the panel recommends the following:

Recommendation 5-1: For applicants who have filed a tax return in either of the prior two tax years, the information used to complete the application and determine the award should be based on the most recently filed of these income tax returns. When the earlier tax year is used, updated information should be required as soon as a new tax return is filed. The updated information should not be used to change the award before the next term. Further, "household size" and "number in postsecondary school" should be based on the situation as of the application date.
For example, applicants for the 1994-95 school year would use 1992 tax year data if they have not filed 1993 taxes at the time of application.

TABLE 5-4 Components of Student and Institutional Error in the Federal Student Financial Aid Programs, 1988-89 ($ millions; columns for the Pell Grant, Campus-Based, and Stafford Loan programs)

Student error (a):
- Parents' number in college (Pell: 72)
- Parents' household size (Pell: 70)
- Parents' home value (Pell: 47)
- Student's adjusted gross income (Pell: 44)
- Parents' real estate/investment value (Pell: 32)
- Student's household size (Pell: 31)
- Student's 1987 other nontaxable income and benefits (Pell: 24)
- Parents' 1987 adjusted gross income (Pell: 24)

Institutional error (b):
- Missing statement of educational purpose (Pell: 29)
- Missing Selective Service compliance statement (Pell: 29)
- Award to students with bachelor's degree (Pell: 29)
- Missing financial aid transcript (Pell: 14)
- Incorrect determination of cost of attendance (Pell: 9)
- Independent under unusual circumstances (Pell: 1)
- Default/Repayment
- Half-time enrollment (Pell: not observed)
- Factoring other aid (Pell: NA)

[The Campus-Based and Stafford Loan column values are garbled in the source text and are not reproduced here.]

NOTE: "Home value" was removed from consideration under reauthorization.
a. These numbers are rough estimates based on values found on the graph (Price Waterhouse, 1990b:Exhibit III-1, p. 14).
b. These numbers are rough estimates based on values found on the graph (Price Waterhouse, 1990b:Exhibit III-2, p. 16).

Institutional Error

The most recent study (Price Waterhouse, 1990b) reported institutional error for three categories: procedural, calculation, and distribution errors. Procedural errors occur when institutions do not adhere to established guidelines for granting awards. Table 5-3 compares sources of institutional error over time in the Pell Grant Program, and Table 5-4 compares sources of institutional error by type of Title IV program for the 1988-89 academic year. Note that for incorrect determination of enrollment status and of cost of attendance, the estimated net error in the Pell Grant Program for 1982-83 was negative, which indicates that the estimated underpayments (which we show later in this chapter to be underestimated) were greater than the estimated overpayments for these error sources. Enrollment status does not appear to have been a major source of error in the Pell Grant Program in later years, but it persisted as a troublesome issue in the Stafford Loan Program. According to the Advanced Technology and Westat (1987b) report, confusion regarding enrollment status generally occurs in cases involving students attending summer sessions or clock-hour students. Incorrect determination of cost of attendance causes a large dollar error in all three Title IV programs.

Incorrect calculation and distribution of awards are also persistent problems for institutions. Calculation error, which Price Waterhouse (1990b) reported as dominating institutional marginal errors, occurs when incorrect information is used to calculate awards, for example, errors made in determining the correct cost of attendance. Factoring in other aid produces considerable calculation error as well. In the 1988-89 academic year, when a calculation error was made for 21 percent of applications, factoring error was found for 9 out of 10 applications in error. Distribution error occurs when an incorrect award amount is disbursed. With respect to the expected award or certification, the Price Waterhouse (1990b) study found meaningful errors in the Campus-Based and Stafford loan programs, but acknowledged potential overstatement of the largest source. These errors do not affect the need calculations as do all the other errors discussed in the study.
The panel will focus here on the award error components, since audits of financial administration were already discussed. As can be seen from Tables 5-3 and 5-4, technical errors, such as a missing statement of purpose, certification of Selective Service registration, or financial aid transcript, have a large dollar impact, for when these documents are found to be missing, their absence renders the entire award erroneous. Such cases may or may not otherwise be eligible for financial aid, and for those who are eligible, the calculated award may or may not be the correct amount. The General Accounting Office (GAO) requested an analysis by the Department of Education of cases in which technical errors due to missing documentation were ignored and the files reanalyzed for substantive errors. The analysis indicated that the estimated dollar value of institutional errors dropped by about a third when technical errors were removed, and the percentage of cases with "institutional error" dropped by half (General Accounting Office, 1985). Underawards, when technical errors were ignored, were more frequent and had a larger dollar value than overawards. Department of Education staff mentioned that these administrative requirements are easy to observe during audits and reviews and can be indicators of more serious administrative problems. The panel believes that there is a need to refocus the effort used to detect technical errors. For example, potential improvements in the indicators of quality have been suggested by Goodwin (1991).

Recommendation 5-2: The Department of Education should determine which, if any, of the administrative requirements that can lead to "technical error" are useful surrogates for variables that deal directly with program goals but are difficult to measure. Administrative requirements that are useful in this sense should be retained, but others should be eliminated.

The Most Recent Study of Error: The IQCMP

The Integrated Quality Control Measurement Project (IQCMP) is a large-scale survey of institutions and students who received financial aid during the 1988-89 academic year. The survey, designed and executed by Price Waterhouse in association with the Gallup Organization and Pelavin Associates, involved the first-stage selection of 350 academic institutions and second-stage selection of 3,310 students therein, for a total of 2,653 usable award analyses. The study attempts to evaluate the quality of the Title IV delivery systems for the (1) Pell Grant Program, (2) Campus-Based Programs (Supplemental Educational Opportunity Grant, College Work-Study, and Perkins Loan), and (3) Stafford Loan Program.

The IQCMP is by far the most ambitious undertaking of its type since the inception of the Title IV programs, and the panel wishes to emphasize at the start of this discussion that the contractors should be commended for their efforts to obtain measures of error that can lead to meaningful improvements in the system. Indeed, the panel considered the IQCMP to be so important to an understanding of the state of program error that it commissioned a special background paper (see Appendix A).
Many of the comments that follow are based on that paper, although it should be read in its own right for a fuller understanding of some of the problems with the Price Waterhouse study.

Estimated Error

The IQCMP provides estimates of errors of overpayment and underpayment in the distribution of Pell grant and Campus-Based financial aid, and estimates of overaward errors in Stafford loans. The errors are reported in absolute terms; that is, both under- and overawards are assigned positive
values as their measure of error in quality, and they are reported in terms of percentage of total dollars disbursed, percentage of awards in error, and mean dollars of error per recipient in error.

Components of Measured Errors

A detailed description of the components of measured errors is contained in the report and need not be repeated here. However, it should be noted that the definition of error used in the study encompasses discrepancies between the "actual" award and the "best" award permitted under Title IV regulations. In some cases, this definition resulted in the inclusion of some measured error that resulted from unavoidable projection errors. For example, an overaward resulting from a student or family incorrectly projecting household size is counted as an error in the IQCMP, whereas a good faith error of this type is not an error under the Title IV regulations. Other cases were labeled as error because of discrepancies between federal regulations and state or institutional rules. For example, the failure of an academic institution to follow its own internal rules, say, giving a certain student a Perkins loan that exceeds a stated maximum allowable amount, is not a violation of federal rules but nevertheless is a measurable error in the IQCMP. Although the appropriateness of some of the cases of error may be debatable, in general the broader definitions result in measures that are more closely associated with "quality of delivery" than would be obtained under stricter statutory definitions. Further, since the sample consists only of recipients, it cannot be used to estimate errors of nonpayment to eligible nonrecipients. For these three reasons (the use of absolute error, the broad definition of error, and the exclusion of nonrecipients from the estimates), the error estimates from the IQCMP do not represent dollars spent in error.
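The three summary measures described above can be sketched with a short computation. The award amounts below are purely illustrative, not drawn from the IQCMP; the sketch only shows how treating both under- and overawards as positive absolute error feeds into each reported measure.

```python
# Hypothetical award records: ("best" award permitted by regulation,
# actual award disbursed), in dollars. Illustrative values only.
awards = [
    (2200, 2200),   # no error
    (1800, 2100),   # overaward of 300
    (2500, 2000),   # underaward of 500
    (1500, 1500),   # no error
    (2000, 2350),   # overaward of 350
]

total_disbursed = sum(actual for _, actual in awards)

# Absolute error: under- and overawards both enter as positive amounts.
abs_errors = [abs(actual - best) for best, actual in awards]
in_error = [e for e in abs_errors if e > 0]

# The three summary measures reported in the IQCMP-style presentation.
pct_dollars_in_error = 100 * sum(abs_errors) / total_disbursed
pct_awards_in_error = 100 * len(in_error) / len(awards)
mean_error_per_recipient_in_error = sum(in_error) / len(in_error)
```

Note that because under- and overawards are added rather than netted, the dollar measure here is larger than the net budgetary impact, which is one of the reasons the IQCMP figures do not represent dollars spent in error.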
Errors in the Estimation of Error

In any sample survey aimed at assessing error in a process, there are two major sources of error in the summary measurement of the error of interest. To avoid confusion in the multiple use of the term error in the same sentence, we must distinguish between a "true" error in quality and an error in its observation. First, if a deviation in quality, when it occurs, can be detected and its magnitude observed with perfect accuracy (i.e., without error), then the only error of concern in assessment is the sampling error encountered when survey statistics are used to estimate population quantities such as averages and totals. Unfortunately, in most observational phenomena, there is another source of inaccuracy, called nonsampling error. Within the realm of nonsampling error, one can further distinguish among causes due to problems of undercoverage, nonresponse, and mismeasurement. For example, in the IQCMP the frame from which the sample was drawn excluded certain newer institutions. This exclusion can lead to an undercoverage error if the part of the target population that had no chance of falling into the sample is sizeable and has a . . .

[Pages 94 through 105 are not included in this excerpt.]
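The sampling-error component of this distinction can be illustrated with a small simulation. The population below is hypothetical (most recipients with zero error, a minority with positive absolute error) and is not based on IQCMP data; the point is simply that a sample-based estimate of a population total carries a calculable standard error even when each case is measured perfectly.

```python
import random
import statistics

random.seed(7)

# Hypothetical population of 10,000 aid recipients: 8,000 with no error,
# 2,000 with an absolute award error between $50 and $900.
population = [0.0] * 8000 + [random.uniform(50, 900) for _ in range(2000)]
true_total = sum(population)

# Draw a simple random sample and estimate the population total error.
n = 400
sample = random.sample(population, n)

N = len(population)
est_total = N * statistics.mean(sample)             # estimated total error
se_total = N * statistics.stdev(sample) / n ** 0.5  # its sampling standard error
```

Nonsampling error (undercoverage, nonresponse, mismeasurement) is not captured by this calculation at all, which is why a small standard error by itself does not guarantee an accurate estimate.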
The 1992-93 Student Aid Report

The panel also examined the Student Aid Report (SAR), which is sent to applicants after the paper version of the AFSA is processed. If an AFSA is not rejected for one of a variety of technical reasons, an applicant can expect to receive a SAR from the processor within about four weeks. The SAR contains the data given by the applicant on the AFSA, information from the Department of Education about the applicant's eligibility for federal student aid, and instructions about what to do next. At this step, applicants also might receive either an Information Review form or an Information Request form on which to provide additional information or make corrections and to verify any assumption edits that were inserted during processing. Applicants who appear to qualify for Pell grants based on their initial application receive a payment voucher at this time. If corrections or additions to an application are needed, applicants can expect to wait two to three weeks after they submit changes for receipt of a revised SAR.

As part of its study, the panel reviewed the comments of over 600 applicants who wrote to the Department of Education in 1991 in response to an invitation in the SAR to voice their comments or ideas for improving the SAR. Three-fourths of the comments referred to confusion in reviewing or correcting data due to a lack of correspondence between the item sequence on the SAR and the item sequence on the AFSA. Apparently, as students work their way through notations on the SAR, they must move back and forth within the AFSA to make corrections, all the while paying careful attention to the AFSA's color coding scheme. Thus, in line with the recommendations above, the panel also suggests that the Department of Education review the SAR with the goal of relieving the burden on the applicant. Suggestions about potential changes to the SAR are discussed in Appendix C.
Electronic Version of the 1992-93 AFSA

In addition to examining the paper versions of the AFSA and the SAR, the panel also reviewed the Department of Education's Electronic Data Exchange application process, also known as Stage Zero (for details of this review, see Appendix C). According to the user's manual, the electronic application form for 1992-93 is "intended for student use in the financial aid office environment, with monitoring and counseling done as needed by financial aid staff. Some institutions may want to have aid administrator staff walk through the questions as the student enters his/her information. Others may have staff members use this entry mode while interviewing applicants" (U.S. Department of Education, 1991a:15-1). In practice, most applicants work from an already completed paper version of the AFSA.
Alternatively, an "expert" entry mode is provided so financial aid personnel can quickly enter data from applications initially submitted on paper.

An advantage of the Stage Zero process is that it includes an Electronic Needs Analysis System, which enables financial aid personnel to use data already keyed by students to calculate a number of expected results for a given applicant. Using this system, financial aid personnel can calculate an expected family contribution using the Congressional Methodology. Other financial calculations, such as an estimated Pell Grant Index for determining Pell eligibility, as well as an estimated Pell award amount, can be performed (the Congressional Methodology and the now unused Pell Grant Index were discussed in Chapter 3). In addition, Stage Zero allows verification selection criteria to be applied on-site. While the calculations are not official, they do provide immediate information to applicants. Further, an important incentive for using this system is that it can considerably shorten the time from application completion to receipt of federal aid.

Stage Zero also can be used for electronic filing of renewal applications, a process that was introduced for the 1992-93 school year and is available to returning students at selected institutions. Renewals for other students require filling out the entire paper AFSA, no matter how much of the data previously filed are still correct. In the electronic renewal process, only changes in data need be entered, after which a copy of the completed form is generated by the computer.

Learning to use Stage Zero is easy. Experience with a keyboard is helpful, but even hunt-and-peck typists and others inexperienced with computers should be able to adapt to the system easily. Answers to most questions are precoded, and those that are not do not generally require lengthy answers.
The first screen displayed by the program contains simple instructions on using the system, and additional information is provided through help messages. On the screens that follow, the wording, sequencing, and numbering of items match the contents of the paper version of the AFSA. A limited number of questions appear on each screen, which overcomes the "crowded" look of the paper AFSA, and answer cells are consistently located on the right-hand side of the screen (except for long items, such as names and addresses), which eliminates the varied formats of the paper AFSA.

Although the panel's review of Stage Zero revealed several minor problems, the electronic application process appears to have many advantages over the paper process. Data entries can be proofread on screen, and errors can be corrected simply by backing up and rekeying the entries. In addition, edit and query messages automatically appear on screen to alert the user to incorrect, incomplete, inconsistent, or suspect data. In some cases, previously inserted data are entered automatically when called for in later items. In addition, the complex skips called for in the paper version are
performed automatically, and help messages are available throughout the program. The program also contains worksheets that do the mathematical calculations called for in some parts of the application and then automatically insert the totals into the proper box on the application form.

When using Stage Zero, an applicant can stop at any point in the program, although if the application is not completed, the entries made cannot be saved. The Department of Education should determine whether this is a shortcoming in need of action. At the conclusion of the program, applicants are asked if they want to review the application. Choosing "yes" causes the display to return to the beginning of the form. Corrections can be made easily, and when the applicant okays the data, the program automatically informs the user that the data are being validated. Corrections can also be made if inconsistencies are flagged during the validation process.

If the application passes the validation stage, the program automatically saves the file and prints a summary of the data entered on a two-page printout, which also contains a certification statement for the student to sign. The Quality Control Guidelines in the user's manual instruct the applicant to then compare the answers on the printout with the responses on the applicant's paper form. However, this is a difficult task because item numbers are not shown on the printout and section names do not match those on the paper version of the AFSA. Moreover, applicants are instructed to make written corrections on the printout, but that is not always possible due to a lack of space. Completed and validated applications can be sent to the Department of Education's Central Processing System automatically, or data files on floppy diskettes can be mailed.
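The automatic edit and query messages described above can be sketched as a set of field checks run against the keyed data. The field names and rules below are hypothetical illustrations of the idea, not the actual Stage Zero edit specifications.

```python
# A minimal sketch of on-screen edit checks; field names and rules are
# hypothetical, not the Department of Education's actual Stage Zero edits.
def edit_checks(app: dict) -> list[str]:
    messages = []
    if app.get("household_size", 0) < 1:
        messages.append("Household size must be at least 1.")
    if app.get("number_in_college", 0) > app.get("household_size", 0):
        messages.append("Number in college cannot exceed household size.")
    if app.get("adjusted_gross_income", 0) < 0:
        messages.append("Adjusted gross income cannot be negative.")
    if app.get("dependency_status") == "dependent" and not app.get("parent_income_reported"):
        messages.append("Dependent applicants must report parents' income.")
    return messages

# An inconsistent entry triggers a query message before the data are accepted.
flags = edit_checks({
    "household_size": 3,
    "number_in_college": 4,          # inconsistent with household size
    "adjusted_gross_income": 12000,
    "dependency_status": "independent",
})
```

Running such checks at entry time, rather than weeks later during SAR review, is what lets the electronic process catch incorrect, incomplete, or inconsistent data while the applicant is still at the keyboard.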
Applications can be processed and an Electronic Student Aid Report returned to the institution within 72 hours using the Electronic Data Exchange System.

The Stage Zero process has many advantages. Many of the advantages are nullified, however, by having an applicant first fill out the paper AFSA and get the necessary signatures before filling out the electronic application. The panel believes that this does not take advantage of the many Stage Zero features that make it easier for the applicant to understand the information required on the application or to provide that information accurately. If applicants could start with Stage Zero, working from notes, and fill out the paper form concurrently or subsequently, errors and burden might be reduced.

Moreover, the use of Stage Zero is currently limited to postsecondary educational institutions participating in student aid programs. However, a slightly modified version of the software might be appropriate for off-campus use. In fact, many home and most office personal computers meet the requirements for installation of the 1992-93 Stage Zero version, and with appropriate modifications, it might be possible for high schools, churches,
community organizations, civic groups, and mentors with access to a computer to assist aid applicants in filling out the forms.

Recommendations on Financial Aid Forms

Based on its analyses of the various forms used in the federal student financial aid application process, the panel makes two recommendations:

Recommendation 5-9: The complexity of the forms, instructions, and information booklets leads to excessive burden for applicants and is a cause of error. Thus, the Department of Education should consult experts in form and question development, such as those found at cognitive research laboratories, to aid in its efforts to improve the application materials.

Recommendation 5-10: The Department of Education should continue to improve and expand the availability of its electronic data exchanges, making sure to address the issue of appropriate balance between the potential for improved data and additional burdens that might be placed on the schools.

Although the panel was unable to estimate the amount of error caused by current versions of the AFSA and SAR or the reduction in error due to use of Stage Zero, the evidence reviewed above suggests that attention to these recommendations has considerable potential to reduce applicant burden and improve the quality of data. The reduction in burden, alone, is a quality dimension that should be of interest to a governmental agency. Improved data quality can reduce inspection and correction activities during SAR and verification reviews.

POSTSECONDARY INSTITUTIONS

Institutions are increasingly burdened by the administrative complexities imposed by the federal government. From an original promise to administer the Pell eligibility requirements centrally, the government has moved to an increasingly complex student aid report, which must be verified by school personnel.
The addition of information unrelated to student need, such as Selective Service status, resulted in a form that has grown from two pages to four. In addition, hard copies of applications and related "underwriting" materials must be physically retained for long periods, which creates the need for expensive warehousing in financial aid offices. Postsecondary institutions have other concerns as well. The various layers of aid and the complexity within each layer contribute to a cumbersome aid
110 QUALITY IN STUDENT FINANCIAL AID PROGRAMS

structure. Additionally, with continuous regulatory changes and requirements, schools must simultaneously adapt to new rules and accurately administer financial aid. All factors considered, the system inevitably becomes error prone. In the remainder of this chapter, the panel relates concerns that were repeatedly expressed by representatives of institutions and during visits to student financial aid offices and looks at the available measurements of those concerns.

Administrative Cost and Regulatory Requirements

Federal and Local Costs

The program of oversight of the federal student aid programs appears to impose unreasonably high costs for inspection of processes. Contributing to these high costs are layers of redundant review. An individual applicant, for example, may be subject to repeated examination of the same initial data as a student loan recipient, as a Pell grant recipient, as part of internal audits, and as part of federal audit requirements or program reviews.

Cycles of reexamination are also problematic. Applications undergo separate inspections at each institution if the student is applying for or considering transfer to several institutions of higher education. Even a student who applies to and attends only one institution is not immune from repeated cycles of review of data each time credit load varies, income changes, a new award is received, or under a myriad of other conditions. Revision of each student award five times per academic year is not uncommon. Besides adding to the institution's work, these changes add uncertainty to the financial planning of the student.
State and School Program Coordination

Although the subject of this study is the federal student aid process, it is important to note that to the student and institutional participants, the federal government, while dominant, is nonetheless only one provider of student aid (see Tables 5-5 and 5-6). Assessment of the perceived quality of the aid process must consider all sources of aid, because many students with high financial need would simply not be able to participate in higher education with federal funds alone, even with maximum funds available. This is especially true for a student living away from home at a private university, but it is also true at some public universities. The packaging of the various aid sources, however, adds to the administrative burden of the institution.

State grant programs vary in the proportion of financial aid they provide, but almost every state provides some level of support in addition to
TABLE 5-5 Profile of Aid Packages, All Students

Number of Students               Average Dollars per Recipient, by Program
with Aid Combination    Pell     Campus-    Stafford and
(000s)                  Grant    Based      SLS Loans       Other
10,578                      0        0             0            0
 2,912                      0        0             0        2,854
   607                      0        0         3,646            0
   592                      0        0         3,763        3,891
    50                      0    1,265             0            0
   141                      0    1,495             0        3,461
    92                      0    1,600         4,180            0
   295                      0    1,598         4,114        4,374
   613                  1,243        0             0            0
   636                  1,339        0             0        1,763
   490                  1,545        0         2,805            0
   485                  1,434        0         2,572        2,095
   189                  1,551    1,135             0            0
   353                  1,525    1,325             0        2,085
   187                  1,603    1,085         2,481            0
   370                  1,545    1,465         2,563        2,819

NOTE: SLS = Supplemental Loans for Students Program; "Other" includes state, Veterans' Administration, and institutional aid.

SOURCE: Data from 1990 National Postsecondary Student Aid Study; provided by Daniel Goldenberg.

the federal programs. In part that support reflects the incentive structure of the State Student Incentive Grant Program, a federal grant that requires state matching; however, most states match a much higher proportion of federal aid than is required. In the main, states direct funds to needy students, to perceived manpower needs, and to students with high academic performance. Thus, the states add their own regulations. It is possible that the level of state regulation has been held down by the high burden that federal student aid regulation already places on institutions and on the states themselves. Nonetheless, institutions must often resolve conflicts inherent in federal and state regulations. Further, the student recipient often perceives the layers of state and federal regulation as confusing or, worse, as a roadblock. At high-cost institutions, additional information may be requested from students in order to ration scarce school resources.
For example, a student whose parents are divorced encounters federal and state requirements to provide information from his or her custodial parent and stepparent (if any) but, in addition, may be required to provide parallel data from his or her noncustodial parent and stepparent to meet the institution's needs.
TABLE 5-6 Profile of Aid Packages, Full-Time Undergraduates

Number of Students               Average Dollars per Recipient, by Program
with Aid Combination    Pell     Campus-    Stafford and
(000s)                  Grant    Based      SLS Loans       Other
 3,698                      0        0             0            0
 1,206                      0        0             0        2,648
   380                      0        0         3,067            0
   353                      0        0         2,722        2,792
    35                      0    1,400             0            0
   107                      0    1,482             0        3,259
    54                      0    1,352         2,902            0
   199                      0    1,396         2,805        3,758
   360                  1,379        0             0            0
   471                  1,427        0             0        1,823
   416                  1,597        0         2,868            0
   405                  1,459        0         2,599        1,965
   153                  1,621    1,214             0            0
   301                  1,570    1,402             0        2,115
   163                  1,630    1,121         2,514            0
   336                  1,568    1,480         2,552        2,746

NOTE: SLS = Supplemental Loans for Students Program; "Other" includes state, Veterans' Administration, and institutional aid.

SOURCE: Data from 1990 National Postsecondary Student Aid Study; provided by Daniel Goldenberg.

Extensive and Ever-Changing Regulations and Policy

A significant factor in the ability of financial aid administrators in institutions of higher education to perform their jobs thoughtfully, accurately, and compassionately on behalf of students is the extensive and ever-changing body of regulations and policy governing the student aid programs. Change is rapid and accountability is immediate. Regulations or statutes frequently go into effect within 30 days of being issued or are retroactive, and changes often take effect months before the information required to implement them is received. As a result, financial aid offices and officers must often function with unclear directions. For this reason, the quality and timeliness of information on regulatory changes should be improved, including the interpretation of the regulations and guidance on how to establish administrative systems to implement the changes.

The Higher Education Act, which authorizes funding for federal student financial assistance, is reauthorized every six years. If changes in the student aid programs were limited to a six-year cycle, the process could be
manageable and would allow administrators to use their expertise to perform the best job for all students. However, the reality and the ideal are very far apart. In the past 10 years, over 20 new responsibilities for student aid administrators have been enacted. Ryan (1988) lists the following examples:

· verification of data used to award Pell grants, guaranteed student loans (GSLs), and all federal Campus-Based aid;
· increased Pell program reporting and documentation;
· implementation of a determination of satisfactory academic progress;
· multiple disbursements of Guaranteed Student Loans;
· implementation of the Supplemental Loans for Students and Parent Loans for Undergraduate Students programs;
· verification of citizenship;
· documentation of independent student status;
· adherence to changes in the Federal Tax Reform Act, which affected scholarships and grants;
· new refund and repayment policies; and
· collection of Selective Service compliance and antidrug statements.

Ryan indicates that these duties not only required changes in procedure at institutions, they also required reeducation of students and retraining of staff. In some cases, changes to computer systems and informational handouts were needed.

Although a deadline of December 1 for any policy or regulation to go into effect for the next academic year (which starts on July 1) was established during the 1986 reauthorization, the deadline is frequently ignored. Often, regulations are either deemed too significant to delay, or budgetary savings take precedence over the need to provide sufficient lead time for implementation. As a result of concerns regarding extensive and ever-changing regulations and policy, the 1992 reauthorization act requires a master calendar for financial aid rules and requires that the Department of Education engage in negotiated rule making.
Another issue raised by ever-changing regulations and policy is the inappropriate use of common-cause regulations to address special-cause events.6 Over the years, added layers of regulation have been placed on students and institutions as a whole as a method of achieving compliance from small segments of the student and institutional population. Whether it is worth burdening all to ensure the compliance of a few is questionable.

Common-cause regulations neither adequately deter schools with undesirable outcomes nor reward those that exhibit good management of federal programs. Guidelines are not designed with the intent of improving the

6Recall the discussion of common-cause and special-cause problems in Chapter 2.
administration of financial aid programs by setting performance standards; rather, penalties are assessed for noncompliance as an attempt to motivate good performance.

Financial aid offices struggle to keep up with the burdensome administrative requirements. Administrative costs reportedly have risen faster than the overhead recovery rates allowed by federal regulation; consequently, institutions are forced to allocate resources to the financial aid office that could otherwise be spent on instruction. This is particularly serious given the significant financial crisis facing many, if not most, colleges. Only institutions that award a high proportion of their own funds, as contrasted with federal funds, appear able to invest in improvement. Some institutions believe that certain loans, Perkins loans for example, are too cumbersome for their limited financial aid staff to administer and therefore do not offer them.

Burden and Error

Given the complexity of the student financial aid system and the large number of players involved throughout the process, many concerns arise regarding administrative burden and error. How much burden is acceptable? Who should bear the burden? How real is the problem associated with error?

A recent assessment of regulatory burden (Price Waterhouse, 1991) found consensus among all types of institutions regarding such issues as mandatory in-person counseling and the complex determination of dependency status. Mandatory in-person counseling of all loan recipients upon entering and leaving the institution was found to be burdensome, unproven in reducing defaults, and inappropriate for institutions with low default rates. Similarly, collecting documentation each year to prove that a student is independent, when the student has already proven independence in previous years, seems an unnecessary burden, particularly given that the rules are difficult to administer.
Other agreed-upon concerns included the manual reporting and recordkeeping requirements associated with the financial aid transcript and Student Aid Report, and the burdensome need to collect numerous signed statements from the student regarding illegal use of drugs, educational purpose, and other items. To alleviate much of the burden associated with manual recordkeeping, Price Waterhouse (1991) proposed instituting an electronic system and recommended that changes to specific statutes and regulations be evaluated for their potential to reduce unnecessary burden.

When addressing the concerns related to obtaining signed documentation from students, the issue of technical error arises. Technical errors, or
categorical errors, as some studies refer to them, are a component of procedural error that occurs when an institution fails to adhere to established guidelines. As noted earlier in this chapter, when any one of the following documents required by the Department of Education is not present in a student's file, the student is considered to be categorically ineligible: financial aid transcript, statement of educational purpose, statement of registration for Selective Service, and documentation of independent status under unusual circumstances. These causes of error have the greatest impact on total error because when they occur, the entire amount of an award is considered to be in error. Categorical errors account for about two-thirds of the institutional errors found in studies commissioned by the Department of Education. Three reports (General Accounting Office, 1985; Advanced Technology and Westat, 1987b; Price Waterhouse, 1990b) all found that although the frequency of categorical errors is low, the dollar impact was significant.

In the 1982-83 academic year, 3 percent of Pell grant recipients received awards in error due to a missing financial aid transcript, and the net estimated error equaled $95 million (General Accounting Office, 1985). In the 1985-86 academic year, net marginal error due to a missing financial aid transcript was $41.2 million and affected 2 percent of Pell grant recipients; for GSL recipients, 1 percent had categorical errors totaling $142.8 million (Advanced Technology and Westat, 1987b). Also in the 1985-86 academic year, some form of categorical error occurred in 4 percent of Pell grant cases, resulting in $114.2 million in institutional payment error. Estimates of the error due to a missing financial aid transcript for the 1988-89 academic year, by aid program, are presented in Table 5-7.
Although the dollars in error appear large, as a percentage of total dollars they are no more than the percentage of students with error. Since dollar error is nested

TABLE 5-7 Pell Award Error Due to Missing Financial Aid Transcript, 1988-89 Academic Year

                                   Percent of     Percent of    Average Error
                                   Recipients     Dollars       per Recipient
Program                            with Error     in Error      with Error
Pell                               0.4%           0.3%          $  875
Campus-Based                       0.3%           <0.05%        $  664
Stafford Loan
  (overcertification only)         0.2%           0.2%          $3,397

SOURCE: Price Waterhouse (1990b).
within this small number of students, error reduction strategies still require finding those students in error. Further, once the appropriate documents are obtained and added to the student's file, the technical error is corrected and the award is no longer considered to be in error.

Although the Department of Education requires institutions to collect signed statements regarding illegal use of drugs and educational purpose, the certification is of no direct use to the institution because it neither prevents drug use nor ensures that the student has a valid educational purpose. Since regulations such as these are driven by statute, Congress must take action if changes are to occur. Under the current system, institutions bear a considerable burden in obtaining and storing this type of certification. There is, however, no reason why the depository cannot be changed so that a centralized process developed by the Department of Education takes on an equitable share of responsibility for federal requirements not related to the education of the student. (This action is discussed in Chapter 9.)

In an effort to improve the existing system along the lines suggested above, Congress has made some progress in initiating changes. The 1992 reauthorization of the Higher Education Act calls for increased use of verification mechanisms in which student statements and supporting documentation can be verified through record matching, using either an automated or another system. These provisions include requiring that database matches with the Selective Service be made and that the Social Security number of every aid recipient be verified. With this advance toward electronic transfer of student financial aid data, the burden imposed by the collection of student certifications could be greatly reduced and the value returned by these activities could be increased.
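The general logic of such a record match can be sketched in miniature. The sketch below is entirely hypothetical: the field names, the registry, and the name-agreement rule are illustrative assumptions, not the Department's actual systems. It shows only the basic idea that a confirmed match against an external registry can substitute for a manually collected paper statement, while unmatched records are flagged for the existing manual follow-up.

```python
# Hypothetical sketch of an automated record match. Applicant records are
# checked against an external registry (standing in for, say, Selective
# Service or Social Security data). A confirmed match removes the need for
# a paper certification; anything else is flagged for manual follow-up.
# All names and fields are illustrative assumptions.

def match_applicants(applicants, registry):
    """Partition applicants into matched identifiers (no paper document
    needed) and unmatched identifiers (flagged for manual review)."""
    matched, unmatched = [], []
    for app in applicants:
        rec = registry.get(app["ssn"])
        # Require the registry name to agree with the application
        # before treating the match as a confirmation.
        if rec is not None and rec["name"].lower() == app["name"].lower():
            matched.append(app["ssn"])
        else:
            unmatched.append(app["ssn"])
    return matched, unmatched

if __name__ == "__main__":
    registry = {
        "111-22-3333": {"name": "Ada Lopez"},
        "222-33-4444": {"name": "Ben Chu"},
    }
    applicants = [
        {"ssn": "111-22-3333", "name": "Ada Lopez"},  # confirmed by match
        {"ssn": "222-33-4444", "name": "Ben Chiu"},   # name disagreement
        {"ssn": "999-88-7777", "name": "Cal Novak"},  # not in registry
    ]
    ok, flagged = match_applicants(applicants, registry)
    print(ok)       # ['111-22-3333']
    print(flagged)  # ['222-33-4444', '999-88-7777']
```

Even in this toy form, the design choice is visible: the match is conservative (identifier and name must both agree), so automation can only reduce paper collection, never silently certify a doubtful case.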
In general, technical errors are an administrative problem that can be solved by obtaining the appropriate documentation, and record matching is often an easy way to obtain it. When this type of error is corrected, most student award amounts will not change (except in cases in which an award is incorrectly given to a student who already has a bachelor's degree). Although technical error is not as severe in its impact on the intent of the programs as other types of error found in Title IV aid administration, it is a regulatory violation that troubles policymakers who are locked into an enforcement/penalty approach.

Recommendation 5-11: The Department of Education should increase its efforts to remove unnecessary burden from students and institutions by further development of automated data matches whenever possible.

The panel will expand on this recommendation in Chapter 9.