4
Reconciling the Access, Privacy, and Confidentiality of Education Data

This chapter discusses models that reconcile research access to education records with the confidentiality requirements of the Family Educational Rights and Privacy Act (FERPA). The first two sections of the chapter describe well-established models, in North Carolina and Florida, that allow researchers to access and analyze state longitudinal data. The next section presents an emerging long-term research partnership that permits University of Illinois at Urbana-Champaign researchers to access and analyze longitudinal data from schools and school districts. The final model is a new collaboration in Michigan.

NORTH CAROLINA EDUCATION RESEARCH DATA CENTER

Helen Ladd provided an overview of the North Carolina Education Research Data Center, which she described as “one of the most productive collaborations between a state department of education and researchers.”

The data center was originally created in 2001 through a memorandum of agreement between the North Carolina Department of Public Instruction and a consortium of researchers at the University of North Carolina at Chapel Hill, and Duke University. The data center is housed at Duke University’s Sanford Institute of Public Policy.1

Today’s center had its origins in 2000, when researchers at the two



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.

1 See http://www.pubpol.duke.edu/centers/child/ep/nceddatacenter/index.html.

universities were developing a joint proposal for research into the black-white achievement gap. Realizing they would need large amounts of data, they began discussions with the Department of Public Instruction. The group succeeded in obtaining funding from the Spencer Foundation for two initial studies, for the creation of the data center, and for a colloquium series designed to bring researchers using the data together with policy makers. Although the colloquium series helped to forge ties between state education officials and the university partners, Ladd said, the researchers always emphasized to the state leaders that any research conducted through the center would be independent from the state. Following the signing of the memorandum of understanding in 2001, the Spencer Foundation provided additional support for the data center in 2003, and the memorandum of understanding was updated in 2006. Ladd described the center as a four-way partnership, including the state department of public instruction, the two universities, and the Spencer Foundation (Muschkin and Ladd, 2008).

Ladd explained that the center was initially established to assemble data from the Department of Public Instruction for two purposes—to enable the specific studies of the minority achievement gap funded by the Spencer Foundation and to make the data available to the wider research community. Both purposes have been more than achieved. To date, 93 studies have received data through the center, including 21 projects headed by researchers outside North Carolina.

Ladd argued that the data center’s greatest accomplishment has been to overcome barriers to research using the state’s education data. In addition to barriers related to compliance with FERPA, the Department of Public Instruction stored administrative data in a format that researchers could not use, and lacked resources to link data on teacher characteristics with data on student achievement, or to create longitudinally matched data over time. The data center has overcome each of these barriers, by encrypting the data to maintain confidentiality, checking the data for consistency and accuracy, writing user-friendly code books, merging data across sets (e.g., students with teachers and longitudinally over time), and meeting with researchers to explain what data are and are not available. The data center is populated almost entirely with data from the Department of Public Instruction and does not include data from other state agencies. These data are at the district, school, teacher, and student levels. Although some student and teacher records are matched over time, and some teacher and student data are linked, most matching is done by researchers.

Returning to the issue of confidentiality, Ladd said that the memorandum of understanding is very clear on this subject, referring to FERPA and also to the state board of education policy manual. In this memorandum,

the state interprets broadly the FERPA provision that exempts disclosure of education records from requirements for informed consent if the records are to be used for studies “to improve instruction” (see Chapter 2). The memorandum governs the process through which the state data are made available.

Initially, the data with all identifying information are maintained on a secure server at the Department of Public Instruction. No more than three members of the data center staff—who have been trained in the confidentiality requirements of FERPA and have signed confidentiality agreements—see the data in this form. They encrypt the original data, removing all direct and many indirect identifiers and adding new randomly assigned identifiers. When new data become available from the Department of Public Instruction, the unique encrypted data center identifier is used to link the new record to existing data, and new data center identifiers are assigned to unmatched records. As a result of this process, Ladd said, “I would never see any data with identifiers on it. There are firewalls all around that.”

The data center makes these deidentified data available only to researchers employed at a higher education institution or other research organization that has an institutional review board—excluding journalists, advocacy groups, and other organizations that lack these procedures. The data are also available to graduate students who provide letters of support from their faculty advisers, in which the advisers assume responsibility for maintaining confidentiality. To access the data, an investigator sends a research proposal for review by the director and the associate director of the data center.
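In outline, the linkage scheme Ladd described amounts to pseudonym assignment with a persistent crosswalk: identifiers are stripped, a random data center identifier is assigned once, and later deliveries for the same individual reuse it. The following is a minimal sketch of that idea; the field names and the use of random hex tokens are illustrative assumptions, not details of the data center’s actual procedures:

```python
import secrets

def deidentify(records, id_map):
    """Replace direct identifiers with stable, randomly assigned data
    center IDs. `id_map` (original ID -> assigned ID) persists across
    data deliveries, so new records link to existing individuals."""
    released = []
    for rec in records:
        original_id = rec["state_student_id"]           # direct identifier
        if original_id not in id_map:                   # unmatched record:
            id_map[original_id] = secrets.token_hex(8)  # assign a new random ID
        clean = {k: v for k, v in rec.items()
                 if k not in ("state_student_id", "name", "birthdate")}
        clean["dc_id"] = id_map[original_id]            # linkable pseudonym
        released.append(clean)
    return released

# First delivery from the department
id_map = {}
year1 = deidentify([{"state_student_id": "S1", "name": "A.", "score": 310}], id_map)
# A later delivery: the same student links to the same dc_id
year2 = deidentify([{"state_student_id": "S1", "name": "A.", "score": 325}], id_map)
```

Because the crosswalk stays behind the firewall, researchers see only `dc_id`, which supports longitudinal matching without exposing identity.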
If the proposal is approved, the researcher is required to sign an agreement that includes guarantees of confidentiality and also specifies that the research findings will be shared with both the data center and the Department of Public Instruction.

Ladd explained that the data center process has yielded many benefits for North Carolina’s Department of Public Instruction, in the form of useful studies. For example, Bifulco and Ladd (2007) found that students make considerably smaller achievement gains in charter schools than they would have in public schools, and that charter schools have increased racial segregation and minority achievement gaps. Another study, by researchers at Duke University’s Nicholas School of the Environment and Earth Sciences, showed that blood lead levels in early childhood are related to educational achievement in early elementary school (Miranda et al., 2007). State education officials are also very interested in a study that found that the incidence of problem behavior was significantly higher among sixth graders attending middle schools than among sixth graders attending elementary schools (Cook et al., 2007).

Ladd predicted that further benefits to the state of North Carolina would emerge from a new collaboration between Duke University faculty and researchers at five other universities. These researchers have access to comparable administrative data from other states, including Texas and Florida. Financed by the U.S. Department of Education through the Center for the Analysis of Longitudinal Data in Education Research (CALDER; see Chapter 3), the collaboration facilitates the replication of analyses across states and shared learning among education researchers.

In addition to this useful research, the data center also generates more direct benefits to the Department of Public Instruction, including providing full access to the cleaned and linked data sets generated by the data center. For example, department officials frequently use data center files on student disciplinary infractions, since they are more easily linkable to student demographic and academic information than the data files maintained by the department. In addition, when the department contracts with other state agencies to carry out research, the data center supplies files to these agencies at little or no cost. Over time, the department’s internal research capacity and productivity have both risen. The presence of the data center also allows the department to refer requests for data from outside researchers to the center, eliminating the costs of responding.

Ladd emphasized that the research community’s data needs differ from those of the Department of Public Instruction. Researchers value longitudinal data over a long time period, in order to estimate models of educational activities and outcomes. In contrast, the Department of Public Instruction sometimes needs the most recent data and data with identifiers. Because of these differences, it is important that the department continue to develop its own data management capacity, Ladd said. Noting that the department had received a grant from the U.S. Department of Education to develop a longitudinal data set, she said it was important that the department move forward on this project internally, even though the data center has been working to create a longitudinal data set. She said that the data center has already held useful conversations with the state about this project, predicting that the evolving relationship between the state and the data center will be productive. Ladd observed that it would take the department two or three years to develop its longitudinal data set and that, unlike the data center’s data set, it would not include historical data.

Finally, Ladd noted that the data center and the Department of Public Instruction share the goal of expanding the developing K-12 longitudinal systems to include data on postsecondary education. Through its partnership with the department, which has begun discussions with the state higher education system, the data center hopes to establish the necessary mechanisms for sharing data, ensuring confidentiality, and providing

researcher access to information that would promote policy-enhancing research at all levels of education.

In response to a question about the memorandum of understanding, Ladd said that it was helpful during the negotiations to explain to the North Carolina officials that Texas had a similar arrangement to share data with researchers at that time (see Chapter 3). Robert Boruch called for increased sharing of memoranda of understanding to reduce the burden of negotiating each agreement separately and to encourage more states to develop and share data sets.

FLORIDA’S EDUCATION DATABASE

Jeff Sellers (Florida Department of Education) provided an overview of Florida’s comprehensive data system. He explained that the state developed the Florida Education and Training Placement Information Program in the 1990s, in order to assess the effectiveness of state educational programs. The system used social security numbers to link individual student records at all levels of education and training to other state data sets on employment and wages, public assistance, incarceration in the state prisons, military enlistments, and other life activities. Analysts used the linked data to answer questions about what happens to students after they graduate from or leave educational programs; for example, one study examined the relationship between high school test scores and receipt of public assistance.

Lessons learned in developing this program were applied to develop a central state data repository in 2000, using more current technology. This central data warehouse is administered by the Florida Department of Education but relies on administrative data from a variety of state and federal agencies. It includes individual assessment results from prekindergarten through community college, including scores on teacher certification assessments, as well as other individual student and staff data from all levels of education, prekindergarten through state university.

Since 2004, state analysts have drawn on the warehouse to create several products. Emphasizing that state officials and systems analysts learned as they developed the system, Sellers noted that one of the first products, a series of research extracts, was created in partnership with outside researchers who reviewed the data sets and provided feedback on their accuracy and usefulness. Local researchers from Florida State University also helped the state analysts to study and understand the longitudinal data they had assembled. The next product was a series of data marts, which are made up of data files aggregated by subject, such as enrollments or assessments. They are designed to provide increased access to the data, while protecting confidentiality in compliance with FERPA, Sellers explained.
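A data mart of this kind is, in essence, a pre-aggregated view of the warehouse. The sketch below illustrates the idea with invented fields; the small-cell suppression rule is a common disclosure-avoidance practice assumed here for illustration, not a documented Florida procedure:

```python
from collections import Counter

def enrollment_mart(records, min_cell=10):
    """Aggregate individual enrollment records into counts by
    (school, grade). Cells smaller than `min_cell` are suppressed,
    an assumed disclosure-avoidance rule (None = suppressed)."""
    counts = Counter((r["school"], r["grade"]) for r in records)
    return {cell: (n if n >= min_cell else None)
            for cell, n in counts.items()}

records = [{"school": "A", "grade": 6} for _ in range(12)] + \
          [{"school": "B", "grade": 6} for _ in range(3)]
mart = enrollment_mart(records)
```

Queries against such pre-computed counts never touch individual records, which is one way aggregation can serve both speed and confidentiality.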

Aggregating these subsets of the huge volume of records speeds system performance and allows faster responses to specific queries.

Sellers described the state’s approach to making the data anonymous. As the data are extracted from the various sources (assessment data, preK-12 student data, etc.), they take two different paths (see Figure 4-1). Data on enrollments, assessments, attendance, and the like follow one path into the warehouse, and student and teacher data follow another path, in which identifiers (social security number, name, birthdate, and other student and teacher identifiers) are stripped off. The data enter a “black box” in which each individual or institution is assigned a unique student or teacher identification code, known as a “data warehouse internal identifier.” This anonymous identifier is then relinked with the data that followed the other path, creating a new, anonymous record, which, in turn, is loaded into the data warehouse for storage. Sellers emphasized that this process of matching a new anonymous identifier to each student or teacher is completed only once, mentioning that the group uses a similar technique to make school information anonymous. Data warehouse managers ensure that each unique individual or institution retains the originally assigned identifier throughout the loading and extraction of data.

[Figure: “Matching Individuals in Florida’s Education Data Warehouse: Anonymizing Data,” showing personal identifiers split from the data, passed through the matching process, and relinked via the data warehouse internal identifier before loading.]
FIGURE 4-1 Florida’s approach to making data anonymous. SOURCE: Sellers (2008).
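The two-path process shown in Figure 4-1 can be sketched as follows. The field names, the identity tuple used for matching, and the sequential internal identifier are illustrative assumptions; the chapter does not specify how the “black box” actually assigns codes:

```python
import itertools

# One-time assignment of data warehouse internal identifiers
_counter = itertools.count(1)
_internal_ids = {}  # (ssn, name, birthdate) -> internal ID, assigned once

def load_record(raw):
    # Path 1: identifiers are stripped off and sent to the matching process
    identity = (raw.pop("ssn"), raw.pop("name"), raw.pop("birthdate"))
    if identity not in _internal_ids:             # "black box": match once,
        _internal_ids[identity] = next(_counter)  # keep the ID thereafter
    # Path 2: the remaining data are relinked with the anonymous identifier
    raw["dw_internal_id"] = _internal_ids[identity]
    return raw  # anonymous record, ready to load into the warehouse

r1 = load_record({"ssn": "123", "name": "P. Q.", "birthdate": "1990-01-01",
                  "assessment": "FCAT", "score": 300})
r2 = load_record({"ssn": "123", "name": "P. Q.", "birthdate": "1990-01-01",
                  "assessment": "FCAT", "score": 320})
```

The key property Sellers emphasized is the one-time match: a returning individual is recognized and keeps the originally assigned identifier, so records remain linkable across loads without identifiers ever entering the warehouse.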

The integrated longitudinal data in the warehouse are useful for many different purposes. Administrators in the Florida Department of Education use the data to inform decisions about funding, class size planning, and other matters and to meet the federal accountability and reporting requirements. The data marts produced over the past two years, which report and present the data, have been used to provide feedback reports to high schools and community colleges. In response to state legislators’ questions, analysts are creating a data mart focusing on the effectiveness of the variety of state-funded teacher preparation and certification programs (including alternative certification programs). Using an approach similar to the one Susanna Loeb used in New York, the state analysts link teacher identifiers with student identifiers and also look at the teacher’s educational record when she or he was a university student, to assess the effectiveness of alternative preparation programs based on student performance.

The data are also useful for research, Sellers said. The state of Florida has established research partnerships with the Center for the Analysis of Longitudinal Data in Educational Research (CALDER; see Chapter 3) and has collaborated with the Community College Research Center described by Thomas Bailey to study community college financial aid and high school–community college dual enrollment programs (see Chapter 3).

Sellers concluded his presentation by highlighting key lessons about what has worked in Florida:

• To the extent possible, build on existing systems and expertise. Sellers observed that many states have administrative data, but the key question is how to leverage these data to study and inform policy. For the 27 states that have received federal grants to build longitudinal data systems, he said, the challenge will be linking the data so that students can be followed over time in school and beyond, as they enter postsecondary education or the workplace, receive public assistance, or have other life experiences.

• Pursue opportunities to provide service and share information. Sellers noted that he usually delivers presentations to “people who have the data, not people who want to use it.” In his talks, he encourages the data managers to respond to outside researchers’ requests by considering how the state may leverage the request. Because his agency lacks the resources needed to evaluate current or proposed future education policies, outside research can be very valuable. At times, state officials will ask the researcher to slightly modify the research plan or add a component related to a specific question in order to gain more from the research. Sellers suggested that this could be a selling point for researchers as they approach

states looking for data. The researchers could propose to help the state evaluate some of its programs and policies in exchange for access to the state’s data.

• Exceed all requirements dealing with confidentiality and restricted release. This is important not only because of the legal requirements in FERPA, but also to address how the public, parents, and the media perceive the collection and use of individual data and to counter possible future references to the state government as Big Brother.

• From a development perspective, it is really never over. The answer to any single question about education policy often raises five new questions that need to be addressed.

In discussion, Sellers said that his agency does not have enough staff to respond to all requests for access to the database. He observed that the queue of researchers seeking access was growing longer. In the future, he said, the state would like to create a center similar to the North Carolina Education Research Data Center—a “virtual sandbox.” Ideally, the state could give the key to the sandbox to a qualified, approved researcher, moving to a self-service model that would eliminate the need to respond to each individual request.

DEVELOPING LONG-TERM RESEARCH PARTNERSHIPS IN ILLINOIS

Lizanne DeStefano (University of Illinois at Urbana-Champaign) explained that she had been involved in balancing researchers’ need for access to education data with protections for individual privacy and confidentiality for over two decades, including many years as chair of the university’s institutional review board and in her current position as associate dean for educational research. Observing that “we live in interesting times,” DeStefano argued that longitudinal studies of individual student performance over time are critical for responding to the accountability requirements of the No Child Left Behind Act. Because of this, she said, local and state education agencies are now more motivated to find solutions that enable research while also protecting student confidentiality.

DeStefano outlined different phases in the relationship between University of Illinois at Urbana-Champaign education researchers and local school districts. In the early 1990s, local schools and districts became increasingly unwilling to respond to many different ad hoc requests for data from university researchers. They viewed researchers as people who came into the schools, took data, and left without providing anything of value to the school or district. In addition, these small schools and districts often lacked the time and staff to respond to individual requests that they create anonymous data sets.

To overcome this problem, the university created the Office of School-University Research Relations, a single point of contact for researchers and schools. School officials can call this office to receive assurance that any research proposed for or under way in their schools has been approved by the university’s institutional review board and that the researchers involved have undergone criminal background checks and received training in research ethics and procedures. School officials can also ask questions about the research and its findings and implications for school policy. The new office was successful for many years, facilitating about 150 research projects in schools each semester.

However, in 2004-2005, as school officials grew concerned that spending time with outside researchers was distracting them from improving instruction in order to meet the requirements of the No Child Left Behind Act, they again grew reluctant to sponsor research. At the same time, responding to the federal requirements had increased their awareness of the value of data analysis, evaluation, and research. These changes led the university and the school districts to a new phase in their relationships, establishing long-term research partnerships based on common interests and a shared commitment to school improvement. The new partnerships involve many different school districts in the Urbana-Champaign area and are supported by a new Center for Education in Small Urban Communities.2 DeStefano noted that the university has not yet developed a long-term research partnership with the state of Illinois and only recently signed the first memorandum of understanding governing access to state data.

2 See http://www.ed.uiuc.edu/smallurban/.

Although there are many research partnerships, each includes several common elements. First, it is based on a negotiated long-term research agenda, developed through frank discussions among university administrators and faculty and school district representatives. DeStefano observed that, although she had initially feared that faculty interests would differ from the districts’ interests, she found quite a bit of overlap. These discussions and negotiations led to a list of four areas in which research is critically needed, and the university provides funding and fellowships to encourage faculty to conduct studies in these four areas. In addition, the university commits to sharing the research findings in a form that the school or school district can easily use and apply.

Second, each partnership deploys similar strategies for informing parents and students about research activities. The partnerships use “robust and effective” procedures for disseminating information and obtaining

parental consent, when consent is required, DeStefano said. For example, parents can view survey instruments, protocols, and research summaries, and they can also call a toll-free number if they have a concern or question about research under way in their schools.

Third, each partnership is supported by cross-training of school district personnel, families, and researchers on research ethics and compliance with FERPA. This training develops shared understanding and undergirds the fourth element of the partnerships—formal data-sharing agreements specifying that the school districts will maintain and allow regular access to deidentified longitudinal data sets in specific areas. These formal agreements include common interpretations of FERPA and options for compliance. For example, when informed consent is required for disclosure of education records, the relationships the university routinely develops with families make it relatively easy to obtain signed consent forms. More often, the school district is allowed to disclose deidentified data without informed consent under the FERPA exception for “organizations conducting studies for, or on behalf of, educational agencies or institutions for the purpose of . . . improving instruction” (see Chapter 2). In addition, a memorandum of understanding with each school or district partner allows the university researcher and the district to obtain approval for the research from the University of Illinois at Urbana-Champaign institutional review board, which has been expanded to include representatives of the school districts. No approvals from other institutional review boards are required, saving both the researchers and the school districts time and money.

The final element of each partnership includes measures to strengthen the research capacity of the school district. The university provides resources to strengthen district information offices, including technology, expertise in encryption and security, shared servers, and student interns with background in information technology.

In conclusion, DeStefano said that, although some faculty members had been concerned that the efforts to develop new research partnerships would constrain their research agendas, this has not happened. At first, the negotiated research projects were very focused on students’ performance in reading and mathematics, because this is what the schools wanted. However, after three years of experience, including routine meetings with faculty members, school officials now recognize the value of broader research, including investigations of social, emotional, and behavioral questions. For example, one current study is investigating student health and obesity. Responding to a question, DeStefano explained that school representatives agreed to a broader research agenda partly because they had received a good payoff from the original projects focusing on analysis of reading and mathematics performance, including “tables and graphs

and charts and reports that they could have never generated.” She emphasized that university researchers routinely share reports and studies with school officials.

In response to another question, she said that, with a few exceptions, the university does not provide financial or other incentives for parents to sign informed consent forms. Instead, the university aims to educate and inform parents about research going on in their children’s schools, through a newsletter, a series of public forums, an annual conference cohosted by the university and the school districts, and the toll-free number mentioned earlier. As a result of these efforts, when a student brings home an informed consent agreement, the parents are more likely to be aware of the research and are more likely to read and sign the agreement. Over the past five years, the response rate when researchers send out consent agreements has increased.

A NEW COLLABORATION FOR DATA SHARING IN MICHIGAN

Barbara Schneider began by identifying several differences between her new model of collaboration in Michigan and the North Carolina Education Research Data Center. She explained that the Michigan collaboration was developed much more recently, through a subcontract with the Regional Education Laboratory-Midwest, which in turn is funded by the U.S. Department of Education’s Institute of Education Sciences. The subcontract has two goals (Schneider, 2008):

1. To demonstrate the feasibility of assisting state education agencies in leveraging existing state administrative record data to provide an empirical basis for developing education policies and practices and

2. To provide technical assistance to the state of Michigan, addressing questions initiated by the state—and to document and publicize the technical assistance process as well as any unique analytic problems in working with the state administrative records.

One continuing challenge, Schneider said, is that in Michigan, the Center for Educational Performance and Information (a unit of the state budget office) collects and maintains education data, rather than the Department of Education. Schneider said that different offices in the center allow her research team (including faculty, postdoctoral fellows, and Michigan State University graduate students) to access and analyze administrative data files in response to questions they jointly identify. These questions and their potential answers are constructed to inform state education decision making and have potential budgetary consequences.

The research team has published three reports, including a background survey and two technical reports to the state. The survey yielded responses from education officials in seven Midwestern states about the types of data and data analysis they would find most useful (McDonald et al., 2007). The researchers learned that state education policy leaders are very interested in developing longitudinal data systems with linked student, teacher, and school data. However, most states lack the personnel and the capacity to develop such systems at present, due to their very limited resources.

Before discussing the team’s first technical report, Schneider described the context supporting the analysis. Because the demonstration project was specifically designed to make the collaborative process transparent, a member of the research team took notes at every meeting; all team activities were documented in a historical file; and all data analyses, codes, and procedures were recorded. As required in the subcontract, the team identified unique analytic problems of dealing with universal data, rather than a survey sample. For example, to analyze differences in data on 98,000 teachers, the team used gamma statistics, rather than the t-test commonly used to analyze differences in means between two sample groups.

Schneider outlined three ground rules underlying the researchers’ relationship with the two Michigan agencies. First, the questions were to be generated by the agencies. Second, the analyses would be designed and conducted as an iterative process, which led to modifications at each meeting. Third, the research team would document the technical assistance process, to enable an evaluation of the potential replication of this model.
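The gamma statistic mentioned above (Goodman-Kruskal gamma) measures ordinal association from concordant and discordant pairs of observations. A brute-force sketch follows; it is adequate for illustration, though an O(n²) pair scan would be impractical for records on 98,000 teachers, and the report does not detail exactly how the team applied the statistic:

```python
def goodman_kruskal_gamma(xs, ys):
    """Goodman-Kruskal gamma: (C - D) / (C + D), where C and D are the
    numbers of concordant and discordant pairs. Tied pairs are ignored."""
    concordant = discordant = 0
    n = len(xs)
    for i in range(n):
        for j in range(i + 1, n):
            product = (xs[i] - xs[j]) * (ys[i] - ys[j])
            if product > 0:
                concordant += 1    # pair ordered the same way on both variables
            elif product < 0:
                discordant += 1    # pair ordered oppositely
    return (concordant - discordant) / (concordant + discordant)

# Perfectly concordant ordinal data yield gamma = 1.0
g = goodman_kruskal_gamma([1, 2, 3, 4], [10, 20, 30, 40])
```

Unlike a t-test, which infers a difference in means from a sample, gamma simply summarizes the strength of ordinal association in the data at hand, which suits a universe of records rather than a survey sample.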
Turning to the issue of confidentiality, Schneider explained that the research team is able to access the Michigan data under the FERPA provision allowing disclosure without prior consent "to organizations conducting studies for or on behalf of educational agencies" and because the research team members are designated as agents of the state (see Chapter 2). In addition, the team members have obtained approval from the institutional review boards at their respective universities and at the state level.

In response to a question, Schneider said that confidentiality protections are very important and that she and other professors teach them to their graduate students. For example, she requires her graduate students to practice following the procedures required by the National Center for Education Statistics' licensing process (see Chapter 2). Schneider emphasized the need to inform the research community about confidentiality protections.

The team's first analysis of Michigan data originated in a meeting that included the researchers and staff of the two state agencies, organized by Margaret Ropp, director of the Center for Educational Performance and

Information. The purpose of the meeting was to discuss priority issues that could be addressed through analysis of the existing state data. In preparation for the meeting, Schneider's team created a series of table shells illustrating the types of analyses that could be conducted using the state's data. When they saw these tables, she said, the state analysts "got really excited," because, although they have doctorates and are very familiar with the data, they lacked time to think about questions that the data could answer. She observed that "everybody's eyes lit up" and the meeting continued until 6:15 on a Friday afternoon.

At the meeting, participants discussed Michigan's recently approved "merit curriculum," requiring high school students to complete four years of English and mathematics, three years of science and social studies, and two years of foreign language in order to graduate. The state superintendent of schools wanted to know whether schools across the state had enough qualified teachers to teach the required subjects. To address this question, the state gave the team access to a large file of teacher data. With the help of a postdoctoral fellow, Schneider was able to "unstack" the file into individual teacher records. When team members verified these individual records on thousands of teachers against a Standard & Poor's database of Michigan teachers, they found only a small discrepancy. The team then linked these records with other national and state data and analyzed the linked data sets.

Schneider observed that, as the research team and agency personnel began to work together, they built trust. She said that all of the key ideas identified in her book on relational trust (Bryk and Schneider, 2002)—respect, competence, integrity, and working for the common good—were realized over the course of the project.
For example, she promised to deliver a report within a month of the first meeting with state officials, in order to counter the view that research "takes forever." To meet this commitment, the five team members worked daily to produce a draft, which they reviewed and discussed with agency officials. After revisions, the team delivered a final report.

Schneider said that her graduate student, who was familiar with high school scheduling, developed a demand formula to answer the superintendent's question about whether schools across the state had enough qualified teachers to deliver the merit curriculum. She said that several statisticians had described this formula as "the most simple, elegant way to figure out how many teachers you need to teach the merit curriculum in your school." The formula is designed to adjust for changes in enrollment size, increases and decreases in class size, and changes in the number of courses teachers are required to cover. Applying the formula to the large teacher data set, the team found that only 14 schools had an undersupply of qualified teachers in all four required subjects, but, when considering

each subject and grade level, the undersupply of qualified teachers could potentially affect 72,000 students.

The team's findings on teacher supply and demand were summarized in a second technical report to the state (Keesler et al., 2008). The state officials welcomed the report, using it to target professional development courses and funding toward schools with an inadequate supply of qualified teachers. They have also raised new questions that the research team is currently addressing. At the same time, Schneider and Margaret Ropp, director of the state Center for Educational Performance and Information, are sharing the report's findings and methods, including the formula, throughout the Midwest region (Ropp et al., 2008).

In conclusion, Schneider said that the benefits of the project include a trusting, open relationship with state personnel, collaboration across universities, openness and sharing of information, and working "with a fabulous group of professionals." The greatest challenge, however, is that the state, on the basis of its interpretation of FERPA, has not provided access to linked longitudinal student and teacher data, as the research team originally requested in 2006. Schneider described FERPA as "the shield that stops us and the barrier from getting to the places where we want to be."

Finally, Schneider said that the project is at a much earlier stage than the databases allowing research access in Florida and North Carolina. The team is currently creating a research collaborative that will allow researchers across the state to access the teacher file and other files, and this has raised questions about who should warehouse the data. Currently, she said, the state of Michigan maintains the data, although both Michigan State University and the University of Michigan have proposed to warehouse the developing data sets.
The team is also working on another technical report and collaborating with the state to study several new issues.
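The report describes the graduate student's teacher demand formula only in general terms (it adjusts for enrollment, class size, and teaching load) and does not reproduce it. As a purely hypothetical illustration of that kind of calculation, not the team's actual formula, one might estimate demand for a single subject in a single school like this:

```python
from math import ceil

def teacher_demand(enrollment, courses_per_student, max_class_size,
                   sections_per_teacher):
    """Hypothetical demand estimate for one subject in one school.
    enrollment: students required to take the subject this year
    courses_per_student: course sections of the subject each student takes
    max_class_size: cap on students per section
    sections_per_teacher: sections one qualified teacher can cover
    All parameter names are illustrative, not from the report."""
    sections = ceil(enrollment * courses_per_student / max_class_size)
    return ceil(sections / sections_per_teacher)

# 600 students each taking 1 math course, classes capped at 30,
# each teacher covering 5 sections: 20 sections -> 4 teachers needed
demand = teacher_demand(600, 1, 30, 5)
supply = 3  # qualified math teachers on staff
print(max(demand - supply, 0))  # -> 1 (shortfall of qualified teachers)
```

Running such a calculation per subject and per grade level, rather than school-wide, is what would surface localized shortages like the ones the team reported affecting 72,000 students.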