
Quality in Student Financial Aid Programs: A New Approach (1993)

Chapter: 8 A New Strategy for the Department of Education

Suggested Citation:"8 A New Strategy for the Department of Education." National Research Council. 1993. Quality in Student Financial Aid Programs: A New Approach. Washington, DC: The National Academies Press. doi: 10.17226/2226.

8 A New Strategy for the Department of Education

The Department of Education's current strategies for ensuring quality are characterized by

· centralization of authority,
· reliance on retrospective inspections to bring about quality, and
· onerous sets of frequently changed regulations that focus on compliance with process requirements.

Under the current strategies, the student financial aid delivery system is burdened with a multiplicity of process checkpoints, which add to the cost of running the system, for schools and for the government, and also contribute to a tendency for the relationships between customers and suppliers in the system to be perceived as confrontational. Moreover, there has been an emphasis on meeting stringent national standards that fail to take into account the often substantial differences among schools in regard to their student profiles and missions. As Department of Education staff reported to the panel at its first meeting, the deficiencies of these strategies have led some within the department to seek new strategies.

One of the difficulties in developing new strategies to improve quality is conceptual. In the absence of a statutory definition of quality, regulatory efforts fall into the trap of allowing no margin for error. As reported to the panel, the department has been operating with a "zero error" standard, which is unattainable in practice and therefore not effective in promoting quality improvements. Departmental staff reported to the panel on efforts to develop new strategies, which have been characterized by delegation of authority and responsibility, the "ownership" of performance measures by those who are performing the functions of the system, and a deregulation of the aid delivery process that empowers an institution to determine how best to accomplish program objectives.

The new strategies, as found in the department's Institutional Quality Control (IQC) Pilot Project, focus accountability on results rather than process and emphasize the continuous collection of data useful for quality improvement at participating institutions. In place of the usual imposition of external quotas, these strategies are intended to form proactive partnerships rather than confrontational relationships and to focus on service improvements for the end customers. At the same time, they are intended to meet the need to assess overall system performance by measuring quality on an ongoing basis.

THE INSTITUTIONAL QUALITY CONTROL PROJECT

The panel was asked to study the Department of Education's IQC Pilot Project, a management experiment that seeks to implement a system of accountability at the institutional level to ensure quality performance in the administration of student financial aid. The panel's study consisted of a review of materials concerning the project (workbooks, instructional manuals, three recently contracted studies, and an earlier study that was brought to the attention of the panel late in the panel's deliberations), discussions with departmental and contractor staff, and visits to financial aid offices. The Department of Education expressed interest in two main questions:

· Is the IQC concept viable as an oversight strategy?
· Is the IQC concept viable as an improvement strategy?

Participation in the IQC project is voluntary and assumes that the commitment to ensure data quality already exists at the school.
The aim of the program is to shift the responsibility for quality control from the federal level to the educational institution by providing assistance and the flexibility to develop quality control programs that are tailored to the institution. With this experiment, the Department of Education hopes to reduce the burden on educational institutions, encourage the development of innovative management approaches, improve service to students, and reduce error in the delivery of Title IV financial assistance (Price Waterhouse, 1990a:1). The IQC project was created by the Department of Education in 1985 and now involves some 80 institutions. Ideally, the institutions benefit from the opportunity to learn more about their processes by performing a detailed investigation on a random sample of their current student financial aid applicants. While the department acknowledges that not all institutions are suited for inclusion in such a project, its long-term desire is to involve a

group of institutions that would account for 60 percent or more of all federal student aid funds. Those involved in developing the project in 1985 (volunteer postsecondary institutions, a steering committee, and other parties) agreed that shifting the responsibility for quality control from the federal level to the educational institution would enable institutions to focus their resources on the factors that caused error at their respective institutions. The pilot approach differed from the prevailing integrated verification approach to quality control, in which institutional actions were determined centrally through the application processing edits developed by the department (Price Waterhouse, 1990a).

The IQC pilot schools must perform four major activities:

1. Conduct a management assessment: review the procedures and practices of the financial aid office, assess internal controls, and identify enhanced management procedures. This step helps to build team commitment and to make the project more visible throughout the institution.

2. Measure error caused by students, institutions, and the delivery system: select and review a random sample of student aid recipients to quantify the level of error and the major sources of error. A minimum of 203 students must be sampled, with the minimum rising with the number of aid recipients at the institution. The goal is to identify the types of errors that have the greatest impact on the accuracy of aid awards.

3. Identify corrective actions: develop plans to reduce error, focusing primarily on the major error sources identified through error measurement. With this step, the institution establishes a process for identifying systematically the causes of significant errors.

4. Monitor system status: repeat the management assessment and error measurement activities annually to determine the effectiveness of prior management enhancements and corrective actions.
This review helps to determine whether further procedural changes are needed. Institutions then revise the corrective actions as appropriate.

Eligibility to participate in the IQC project is currently limited to the larger participants in Title IV programs that have shown low levels of financial liability and a commitment to ensure data quality. To be eligible in recent years, an institution had to have at least 2,000 recipients in the Pell and Campus-Based programs and combined program awards of at least $2 million. The institution must also participate in all five aid programs and must have been assessed less than $150,000 in audit liabilities during the two years prior to application for the program. Finally, an eligible institution must agree in writing to follow the IQC pilot program requirements.
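The 203-recipient minimum sample is large enough to estimate an institution's overall error rate with useful precision. The short calculation below is illustrative only (it is not drawn from the IQC workbook) and shows the approximate 95 percent margin of error for an error-rate estimate at several sample sizes:

```python
# Approximate 95% margin of error for an error rate estimated from a
# simple random sample of n student aid files. Using p = 0.5 gives the
# worst-case (widest) margin. Illustrative; not from the IQC workbook.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of an approximate 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (203, 500, 1000):
    print(f"n = {n:4d}: margin of error = +/- {margin_of_error(n):.1%}")
```

At the 203-file minimum, the overall error rate is estimated to within about 7 percentage points; estimates for subgroups of the sample are correspondingly less precise.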

Objectives of IQC

The Department of Education recognizes that its role has been perceived by the educational institutions as strictly regulatory in nature. Indeed, the department has viewed it to be the job of the institutions to find the errors that are sure to exist in their student aid files. The program reviews done by the department and the required independent audits have traditionally offered little help to the institutions in terms of what they should be doing to improve their financial aid delivery systems. The IQC project was offered as a demonstration project to help the institutions determine where problems lie in their systems and to encourage development of innovative management approaches. Of course, the accuracy of the Title IV student aid award process would improve if the discovery of filing and other errors led institutions to make changes in their own systems. Unfortunately, the panel found that many of the changes that are needed are outside the institutions' control or of little real consequence.

The panel believes that the primary purpose of the IQC activities should be to conduct a management assessment (task 1 above), but it appears that the institutions focus on satisfying the requirements of the data gathering and monitoring activities (tasks 2, 3, and 4). That is, the IQC project seems to force the institutions to focus initially on the results of the review of a sample of student aid recipients. The intent of the project is to help the institutions determine the types of errors that are inherent in their records so that they can work toward eliminating or reducing them in subsequent years.
The sampling schemes and procedures described in the IQC workbook reflect statistically valid approaches, but the statistical knowledge available in most student financial aid offices is very limited, and the institutions are provided with few tools to help them determine either the causes of the errors or appropriate solutions. Thus, the panel found that, despite the periodic training programs offered by the department, the IQC institutions will not be able to fully utilize the instructions and software provided to them for statistical analysis of their sample data without further help.

Incentives and Disincentives

The Department of Education offers incentives to encourage institutions to participate in the IQC project. The incentive most often mentioned is that participating institutions are not required to verify applications selected by the central processor, a task required for up to 30 percent of the total federal financial aid applicants at nonparticipating institutions. Recall that for nonparticipating institutions, the central processor identifies a number of applications that are recommended for review, and the institutions then select additional applications to ensure that at least 30 percent of the applications have been reviewed. For an institution with a large number of selected students, this process is labor intensive and not likely to provide valid information about how to change its activities, for two reasons. First, the review process must be cursory for individual data items on the application because of the number of applications reviewed. Second, the nonrandom selection of the applications makes it difficult to project the results to other applications not selected.

Although the incentive to reduce verification is in place, in practice many institutions in the pilot program apparently continue to verify applications selected by the central processor. Also, if the department wants institutions with small numbers of aid students to participate in pilot activities in the future, this incentive may not work. Such institutions will perceive that they have little to gain from participating in the project. For example, since the IQC sample size can be no fewer than 203 applicants, institutions that normally have fewer than 203 centrally selected cases for review will not benefit from this aspect of participation in the IQC.

[FIGURE 8-1 Cohort retention, Institutional Quality Control Pilot Project: number of institutions admitted and number remaining as of 1/92, by cohort group (year entered into project). SOURCE: U.S. Department of Education.]

The Department of Education looks at retention statistics as an indicator of success in the pilot's development. Figure 8-1 shows cohort retention of institutions accepted into and starting the pilot activities in each of the

first five years. This information leads the department to believe that, after discounting the problems that one would expect in first-year activities (only 11 of the 42 institutions in the first cohort participated in the program in the fifth year), there has been steady improvement in acceptance of the pilot project by participants. For institutions in the first cohort there were no minimum size standards, thus providing an opportunity to evaluate the methodology with institutions of various sizes and types. Many of the institutions found that the burden of participation exceeded their commitment to participate. In addition, the first cohort did not have the benefit of automated error calculation tools, which were not developed until mid-1986 (Price Waterhouse, 1990a). The panel requested additional data on year-to-year retention within each cohort (Table 8-1); the data did not indicate that the likelihood of dropping out after one or two years in the pilot has been greatly reduced.

Although long-term retention rates may not exceed 70 percent, the rates should not be a dominant consideration in judging the success of the program. More important are the benefits to the institution and the department's success in achieving its desired compliance objectives. Some institutional changes have been made as a result of information learned from the IQC sample, but panel members, in visits to institutions and discussions with aid officers, found that many if not most of the institutions were not sure how to proceed or even how to generate useful reports using the IQC software and that the amount of work required to complete the IQC process is generally greater than the institutions had anticipated. Finally, the incentives for compliance may not be appropriate.
Penalties imposed for problem accounts found in the IQC sample may cause concern at the institutions, because the independent audits to which the institutions are subject are not reduced even when an institution exhibits desirable results over time.

TABLE 8-1 IQC Project Retention, by Cohort

                                       Number/Percentage Remaining
Cohort   Date        Number     After     After     After     As of
         Admitted    in Cohort  1 Year    2 Years   3 Years   Jan. 1992
First    Jan. 1985   42                                       11/26%
Second   Dec. 1986   22         19/86%    17/77%    15/68%    12/55%
Third    July 1988   23         17/74%    16/70%    14/61%    14/61%
Fourth   July 1989   20         18/90%    17/85%              17/85%
Fifth    Sept. 1991  27                                       26/96%
Total                134                                      80/60%

SOURCE: U.S. Department of Education.
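The cohort figures in Table 8-1 can be checked for internal consistency with a few lines of arithmetic. The sketch below uses the admitted counts and the "As of Jan. 1992" counts transcribed from the table and compares their totals with the reported overall retention:

```python
# Consistency check of Table 8-1: the "As of Jan. 1992" column should sum
# to the reported total of 80 remaining out of 134 admitted (60 percent).
# Figures are transcribed from the table; the check is illustrative.
admitted = {"First": 42, "Second": 22, "Third": 23, "Fourth": 20, "Fifth": 27}
remaining = {"First": 11, "Second": 12, "Third": 14, "Fourth": 17, "Fifth": 26}

total_admitted = sum(admitted.values())
total_remaining = sum(remaining.values())
retention_pct = round(100 * total_remaining / total_admitted)

print(total_admitted, total_remaining, f"{retention_pct}%")
```

The column totals reproduce the table's reported figures of 134 admitted, 80 remaining, and 60 percent retention.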

Prior Studies of the IQC Project

Looking beyond retention in the program as an indicator of the value of the IQC project, the Department of Education has commissioned several studies of the effectiveness of the project (Pelavin, 1989, 1991a, 1991b, 1992; Price Waterhouse, 1990a). In this section the panel briefly critiques some of the methodology of the studies and comments on the most important findings on which panel members were in agreement.

Pelavin (1991a) looked at the 19 institutions that had data for the 1987-88 through 1989-90 program years. Point estimates were provided for "errors" in terms of percentages and dollars, but standard errors were not presented. Comparisons were made between 1987-88 and 1989-90; the 1988-89 data were not included because the department believed that improvements would take two years to become evident. The follow-up report (Pelavin, 1991b) does provide some standard error information and 1988-89 data. This report found, as a whole, less reduction in error between 1988-89 and 1989-90. The panel is concerned that the multiple comparisons made across institutions and programs may have overstated statistical significance, since no adjustments were made to the variances to reflect the multiple comparisons. The reports conclude that the "IQC Project is showing improvements in the delivery of Title IV student aid, albeit slowly" (Pelavin, 1991a:7). A major concern of the panel is that no attempt was made to determine the extent to which the error rate in all institutions, including those not participating in the IQC pilot, was declining (as indicated in larger studies). One would expect such results when program changes are made over the years.

The Price Waterhouse (1990a) report is commendable for looking at error rates at similar-sized, nonpilot schools and relating those data to estimates of national error rates.
The study looked at changes in the percentage of dollars in error at institutions participating in the IQC pilot during the 1987-88 and 1988-89 award years, but it made no estimate of standard errors for these differences. Most of the percentage differences were less than 2 percent, which concerned the panel: with a small sample (412) of applicant files studied, some subgroup comparisons surely had too large a sampling error to support the inference that changes had occurred. Still, the panel concurs with several of the recommendations made in the report: (1) reassess the effectiveness of the IQC pilot in reducing student error and (2) reassess the objectives of the IQC pilot and determine the extent to which it is meeting those objectives.

There is little evidence in any of the studies that student error can be greatly reduced by pilot or additional verification activities, yet the potential exists to reduce institutional errors through pilot activities. Of equal importance, pilot schools provide the possibility of early and repeated measurement of the system and any changes made to the system, and they provide the opportunity to test and evaluate innovative approaches to delivery systems. In short, the pilot has more to offer than what appears to be the Department of Education's primary goal of error reduction.

Pelavin (1992) compared data for 32 schools for which information was available for 1988-89 through 1990-91. However, only point estimates for comparing 1988-89 and 1990-91 were provided. The panel believes that the methodology used to compute standard errors, t values, and degrees of freedom was incorrect. The report concluded that the data "strongly support the precept that the IQC Project is significantly increasing accuracy in awarding and disbursing Federal Title IV student financial aid. The IQC Project, a deregulated approach to quality improvement in the Title IV programs, provides [the department] with necessary accountability measures, extends institutions needed flexibility, and produces quantitative results that document and confirm its success" (p. 3). The panel disagrees with the contention that the data "support" such broad conclusions.

Despite the problems with the studies, the Department of Education should take note of the messages they convey: the schools need more help; student error is not likely to decline as a result of these activities; and the objectives of the pilot project should be rethought. Given the difficulty of using subgroup estimates from studies with such small sample sizes, the panel doubts that further such studies could provide more detailed information of sufficient quality to justify the substantial investment in terms of the cost to the department and the time of the institutions involved in providing the data. Any further analysis of the data from existing studies should use appropriate statistical tools and interpretations.
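The panel's concern about sampling error can be made concrete with a back-of-the-envelope calculation. The sketch below uses assumed error rates of 30 and 28 percent, hypothetical figures chosen only to represent a 2-percentage-point difference (they are not data from the studies), together with the reported sample of 412 files:

```python
# Approximate standard error of the difference between two error rates,
# each estimated from an independent sample of 412 files. The rates
# (30% and 28%) are assumed for illustration, not taken from the studies.
import math

def two_sample_se(p1, p2, n1, n2):
    """Standard error of the difference between two independent proportions."""
    return math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)

se = two_sample_se(0.30, 0.28, 412, 412)
print(f"observed difference: 2.0 points; standard error: {se:.1%}")

# When many such comparisons are made, a Bonferroni adjustment is one
# simple way to control the overall chance of a false positive:
def bonferroni_alpha(alpha, m):
    """Per-comparison significance level when m comparisons are made."""
    return alpha / m

print(f"per-test alpha for 20 comparisons: {bonferroni_alpha(0.05, 20):.4f}")
```

Under these assumptions, a 2-point change falls well inside one standard error (about 3.2 points), which is the panel's point: subgroup differences of this size, relative to their sampling error, cannot support firm conclusions, and the problem is compounded when many comparisons are reported without adjustment.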
Recommendation 8-1: The Department of Education should develop the capabilities needed to ensure that proper statistical methodology is used in future evaluative studies of the Institutional Quality Control project. In addition, future IQC activities should go beyond the emphasis on reduction of verification error rates to improvement of a wide range of financial aid management tasks and investigation of other quality concerns, such as the complexity of the forms and the need to improve outreach activities.

Support of the IQC Project

The Department of Education has instituted many positive activities in support of IQC institutions, including training programs for new and continuing institutions, providing software, developing recognition and awards processes, publishing a newsletter, holding state and regional meetings, and providing technical assistance (using departmental and contracted personnel). The IQC personnel within the department are continually updating their support mechanisms so they can offer more help to the institutions. The panel, based on discussions with financial aid office staff from various institutions, has some suggestions for improving those activities.

Training Sessions

The training programs are generally regarded by the institutions as good; the sessions are appropriate for the areas that are currently covered. Breakouts into special topic/interest sessions seem very worthwhile, and the agenda for those sessions seems well thought out and open to improvement as new ideas become available. The emphasis now is on how to avoid making certain types of errors during the next cycle of financial aid applications. More emphasis should be placed on discovering ways to improve the institutions' processes and systems.

Quality Control Software

The software used by all institutions in the project is developed by a Department of Education contractor. The panel's questions to the contractor resulted in friendly and helpful service. The software was designed specifically for the original requirements of the project, and updates to the software have reflected changes in the law and other requirements. A manual provided by the contractor attempts to explain in detail how various parts of the software's output should be interpreted, including the calculation of various errors, although some of the descriptions of error types and reasons are confusing to some users. The manual describes the two types of data that can be used by the institution for verification: original data from the students' records or "best" data from an updated source, such as the quality control sample.
The panel was informed of several issues regarding the software that the department should address, including inappropriate double counting of variances (in the accounting sense), the need to improve the procedure for updating the software and the timing for delivering updates to institutions, the fact that computer memory requirements often exceed the hardware capabilities in aid offices, the need for a help feature in the on-line access software, and the need to provide software to automate worksheets and quarterly reports.

Recognition and Awards

Although recognition and awards are often used in IQC pilot activities to encourage participation and commitment to new approaches, the recognition and awards should be monitored carefully. A financial aid office that

merely adds more inspection or review activities in an attempt to "improve" the error rates without improving the process should not be encouraged. Recognition and awards should be given for substantial changes to a financial aid process that result in noticeable improvements for the customer.

Newsletter

The newsletter has great potential for increasing communication and understanding among the various organizations participating in the IQC project. More use should be made of this vehicle to educate IQC institutions about the value of making real changes that improve processes. Newsletter materials should encourage other suitable institutions to join the IQC project. Articles prepared by participating institutions that describe the positive and negative aspects of participation in the IQC project would be useful. Articles on how the Office of Postsecondary Education is using the IQC results for its internal processes would increase the credibility of the project. Also, articles on the distinction between the uses of the IQC sample and the uses of the verification process would be helpful.

Meetings

Sponsoring national, regional, and state meetings is one of the important service functions of the Department of Education. At such meetings there is an opportunity to highlight the role of the IQC project and to showcase success stories. Although these meetings are not intended only for those in the IQC project, they could be used more to generate interest in the IQC program. Local and regional networks are an important way for people to learn and help each other, and they should be encouraged by the department, with departmental time and resources if necessary.

Technical Assistance

Participation in the IQC project will inevitably lead to questions and concerns. The department recognized this from the beginning and has attempted to provide assistance.
Departmental staff involved in technical assistance efforts exhibit enthusiastic dedication to providing good assistance. However, the department must acknowledge that many institutions view it not as a partner in the process of improvement, but as a judge. This perception inhibits communication in an area where there should be mutual trust. In addition, departmental staff and contractors should be better informed about how the IQC process actually works. In the struggle to keep up with project demands, technical assistance personnel are not afforded enough time to gain hands-on experience with the process at an institution.

Recommendation 8-2: The Department of Education should periodically make complete revisions of the IQC software and instructional material, rather than engage in continuous patching, and it should seek assistance from software evaluation and improvement specialists and involve the users in redesign decisions.

Recommendation 8-3: Institutions in the IQC pilot program should have more at-hand statistical support, and institutions volunteering to participate in IQC activities should be required to show access to statistical expertise in sampling and data analysis as a condition of being in the program. To some extent this can be accomplished by improving the training of internal staff. Adding material on basic statistical skills to the training program could help, but more effective strategies might include subsidizing locally obtained coursework for financial aid office staff or developing cooperative agreements with local quantitative departments to provide ongoing assistance. The Department of Education should assist the IQC institutions in these efforts by helping develop the cooperative ties and providing monetary support.

Types of Errors Noted in the IQC Project

The types and severity of the errors in the IQC process, including institutional and student errors, have been delineated in studies of the process and are reasonably well understood by all involved. However, the IQC project does not seem to provide any measures of improved service to students, families, institutions, the Department of Education, or the community at large. Errors created by the complexity of the process are, for the most part, ignored or attributed to the institutions or students. Errors caused, as discussed earlier, by the application form itself or by accompanying explanations about items on the form may contribute to a large share of the overall errors noted.
Some aspects of the forms, although not necessarily causing an error, do cause frustration and confusion. In many cases operational definitions of terms, such as "support," seem confusing and subject to interpretation by the department or the data entry contractors. Errors associated with such items as a missing financial aid transcript, a missing Selective Service statement, or a missing statement of satisfactory academic progress may not result in misspent aid dollars, yet they are counted as serious errors. Getting the quality control software to calculate the summary tables without including such errors is at best awkward and perhaps impossible. Also, the cut-off points for errors in declared assets and income seem somewhat arbitrary.

Error types should be defined so that they are clearly understood by all involved. For example, errors that do not affect the amount of the aid

award should not be given equal weight with errors that do. Also, other measures that deal with improvements to the process should be added, and they could perhaps be determined by the institutions themselves. For example, the time it takes to process certain portions of the application might be a useful measure.

SUMMARY OF THE PANEL'S REVIEW OF THE IQC PROJECT

The panel examined the IQC project at great length and gave much consideration to its future. Should it be dropped, continued as a pilot program, or incorporated as a basic approach to quality improvement? Does the IQC project release institutions from too few of the regulations to have any effect? Has it had a noticeable effect on any measures of improvement, or is it impossible for structural changes that are limited to activities within the financial aid offices to result in noticeable improvement?

The fact that there has been a high attrition rate in the project raises several questions. Are the institutions participating because they expect to make changes for the better, because they are seeking relief from the 30 percent verification process, or for other reasons? Does the incentive to join the IQC project come mostly from benefits offered by the Department of Education (external) or from the institutions' desire to seek improvement (internal)? What would happen to attrition if the department paid some of the administrative costs of the program? What incentives might be used in addition to regulatory relief? Facts known about various types of incentives will influence what the department does in the future to maintain and build the IQC process. Clearly, the incentive structure should be reviewed.

Even though first-year schools are paired with a more experienced school, possibly the IQC process is overwhelming.
Perhaps local assistance should be considered, since many institutions will have quantitative staff who could learn more about the financial aid programs as they assist. Other institutions may have to ask for outside help. Statistical expertise should also be developed at the department to permit generation of the appropriate statistical information about IQC institutions and the sharing of aggregate results.

The measures of success should also be reviewed. For example, what effect has the IQC process had on default rates or on customer service? Institutions should target their own areas for improvement rather than use the Department of Education's numerical goals exclusively.

The IQC project offers a chance for the Department of Education to be more proactive (compared with its usual reactive and regulatory mode), but the department may not be making the best use of the chance. For example, because there are strict admission standards, the schools selected for pilot

activities could be allowed more freedom from regulatory and management burdens. The panel is concerned that the department may be more interested in expanding pilot activities as a means of regulatory relief than it is in learning from pilot activities. Where regulations are a burden, the department should address the problem by improving the regulations so all institutions may benefit.

Further, selection of small groups of the pilot institutions to participate in special studies should be considered. In addition, the design of the studies should be rethought in order to maximize what is learned from each study. This is especially important until the department better develops its internal analytic capabilities in these areas. Possibly, it could pretest the application form in a special study. Perhaps the training sessions could be used as a mechanism for obtaining suggestions from the institutions about improving the IQC process. Institutions in the pilot might show evidence of systemic problems, but they will not be able to correct such problems effectively. Systemic improvements will require fundamental changes to systems, many of which will have to be made by the Department of Education.

The IQC project is a valuable source of information for the Department of Education. While it may help schools to lower their error rates, that is but one role. The department must remain aware of the two-track system that the pilot creates. The entrance requirements and commitment needed to participate are likely limiting participation to the most quality-conscious schools, and surely the largest. This limits the use of data for inference about the progress of quality improvement. Will policies found useful in these special schools be as useful if tried in nonparticipating schools?
Wouldn't the philosophy of risk-based management encourage selection of the worst of the nonparticipants as pilot possibilities, so that issues of quality related to the larger universe of schools might be addressed? These questions point to a fundamental issue that should be addressed by the pilot activities. The pilot project should make data available from which to infer "common-cause" problems, so that the department can consider system improvements and determine what works and what does not work when innovations are attempted.

Recommendation 8-4: The IQC program should be retained and given the support of top management. However, when the common-cause regulatory burdens imposed by audits and reviews are eliminated (for details see Chapters 4 and 5), the incentives to join the pilot project will change. Thus, the following improvements should be considered:

· Incentives for institutions to join the IQC should be reexamined with the aim of encouraging schools with known administrative difficulties to seek help from Department of Education staff.

· The department's measure of success in the IQC program should

be reexamined with the aim of going beyond measuring retention in the IQC program to identifying ways to improve processes.

· More use should be made of IQC data. For example, the Department of Education should inform all institutions about the progress of the IQC program and suggest use of measurements and practices found to be successful in addressing specific concerns about quality.
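To make concrete the earlier point that payment-affecting and procedural errors should not be folded into a single error rate, the following sketch reports them as separate measures. It is purely illustrative: the record fields, sample figures, and choice of measures are hypothetical and do not reflect actual Department of Education rules or IQC software behavior.

```python
# Illustrative sketch only. One way an institution might summarize IQC
# verification results so that errors that change the award amount are
# reported separately from documentation (procedural) problems, such as
# a missing financial aid transcript, that do not misspend aid dollars.
# All field names, weights, and sample figures are hypothetical.

def summarize_errors(records):
    """Summarize a sample of verified aid records.

    Each record is a dict with:
      - "awarded": aid amount originally awarded (dollars)
      - "verified": aid amount recomputed after verification (dollars)
      - "procedural_errors": count of documentation problems that do
        not change the award amount
    """
    total = len(records)
    # Payment-affecting errors: the recomputed award differs from the original.
    payment_errors = sum(1 for r in records if r["awarded"] != r["verified"])
    # Procedural errors are tallied separately, not mixed into the payment rate.
    procedural = sum(r["procedural_errors"] for r in records)
    # Average absolute dollar error gives a third, dollar-weighted view.
    misspent = sum(abs(r["awarded"] - r["verified"]) for r in records)
    return {
        "payment_error_rate": payment_errors / total,
        "procedural_errors_per_record": procedural / total,
        "mean_dollar_error": misspent / total,
    }

# Hypothetical sample of four verified records.
sample = [
    {"awarded": 2000, "verified": 2000, "procedural_errors": 1},
    {"awarded": 1500, "verified": 1200, "procedural_errors": 0},
    {"awarded": 1000, "verified": 1000, "procedural_errors": 0},
    {"awarded": 2500, "verified": 2600, "procedural_errors": 2},
]
summary = summarize_errors(sample)
```

Reporting the three figures side by side, rather than as one combined count, lets a school (or the department) see at a glance whether its problems are chiefly documentation lapses or actual misspent aid dollars.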
