In the first item on the agenda, participants debated the place of green chemistry in the curriculum of the future and spoke about the challenges of rooting sustainability in both the curriculum and textbooks. The need to educate faculty, industrial scientists, and the general public was also discussed. As an introductory comment, Andrea Larson stated that the challenge lay in approaching education at all levels, and she emphasized the difficulty of understanding the scope and scale of the challenges facing the chemistry community.
Brad Allenby described education in sustainable chemistry within the larger context of the overall changes taking place in this field. Chemistry is shifting in meaning right now, he said. “We appear to be in the period of fundamental redefinition of much of the intellectual landscape, and I think that is important to bear in mind when we think about what we know and what we don’t know,” Allenby said.
Allenby believes the current education system fails to equip any student for the world in which they will be working and living. In particular, the topics of ethics and systems complexity are either missing or severely de-emphasized. In his opinion, no student should be permitted to graduate from any institution of higher learning without completing a course that equips him or her to think about complexity. This should not be a technical course that teaches modeling; instead, it should be designed to teach them how to intuitively consider complex systems and the limitations when working with them.
Many American students obtain an excellent education in the specifics of their disciplines. However, education at all levels currently lacks conceptual insight into sustainability and green chemistry. Missing are the curricula, teaching materials, and multidisciplinary programs that cover not only technical competence but also the social and environmental dimensions of green chemistry. Allenby stressed the need for the ability to understand large-scale systems at an appropriate scale, especially since this is not something that an individual scientist or firm can do effectively. “We need an institutional basis to maintain a dialogue with these systems so that, for example, when the atmosphere begins to display strange chemistry based on a very, very small percentage of CFCs, we are able to respond,” Allenby said.
There is also a need for an appropriate prioritization of values, ethics, and goals. Opinions differ on the values placed on different aspects of green chemistry. “If I am working in a factory and you find a way to substitute for a carcinogen that I am being exposed to, then I am going to like that. I may not care too much if that has impacts down the line on ecosystems,” Allenby explained. These problems are currently being solved on an individual scale. The current policy structure encourages the imposition of individual values, an adversarial process. A single set of values applied to difficult questions will most likely prove inadequate, and the field of chemistry needs to move beyond this approach. To achieve that, institutional capability must exist. However, Allenby warned that there might not be any easy solutions. “I think that we need to appreciate the complexity of what we are doing and begin to develop tools that allow us to do better in the short run, while we are working on evolving the institutions we need in the longer run,” he said.
Mary Kirchhoff of the American Chemical Society (ACS) looked at some trends in green chemistry education. The ACS Green Chemistry Institute is a strong advocate for green chemistry education. Other voices are also joining this call, which helps to build the case for increased education in this area. “Right now, there are a few champions who are very passionate about what they are doing, believe very strongly in green chemistry, but it is not across the board, and that is really where we have to keep working,” she said.
Schools with green chemistry courses include Carnegie Mellon University, Davidson College (North Carolina), and Hendrix College. “You are not limited by the size of your institution, if you want to integrate green chemistry into the curriculum,” Kirchhoff said. The green chemistry lab at the University of Oregon offers the most comprehensive approach she has seen. All undergraduates who take organic chemistry are exposed to a green chemistry approach in the lab. The University of Massachusetts has instituted a Ph.D. program in green chemistry. At the University of
Scranton, Michael Cann has developed a number of online modules that are easily accessible for use in different subject areas, such as physical chemistry, general chemistry, and organic chemistry. However, sustainability education must move beyond four-year colleges. For example, community colleges tend to be overlooked in the educational picture. In a critical move, the University of Oregon has partnered with the local community college to encourage transfers from two-year to four-year colleges. When these students move to the four-year schools, educators want them to have been exposed to green chemistry and sustainability concepts.
Textbooks are also largely devoid of green chemistry, sustainability, and related topics. The most recent edition of Brown and LeMay’s General Chemistry contains five pages on green chemistry within its “chemistry and the environment” chapter. Zumdahl1 has a sidebar on green chemistry that describes the use of CO2 for dry cleaning. Kirchhoff said that these are steps in the right direction, but educators tend to skip sidebars in an effort to get through an overly ambitious syllabus. Many interesting topics, especially the more modern research areas, tend to be in sidebars and side boxes. As a result, they do not get covered in the main body of the course. Organic chemistry is more encouraging; the most recent edition of Solomons’ Organic Chemistry has five different green chemistry examples embedded in the text. Overall, green chemistry is starting to creep into mainstream textbooks. “This is where I think we really need to be focusing our efforts, if we want to see a lot of students impacted by green chemistry,” Kirchhoff said. In addition, the subjects of toxicity and toxicology should receive more appropriate attention. Usually, the LD50—the “lethal dose” that kills 50 percent of a group of test animals exposed to a material—is the only toxicity or toxicology topic covered. Occasionally, a textbook will refer to poisons and cover the alkaloids and poison dart frogs. There is room for improvement in incorporating these subjects into educational material.
Lab lectures should also incorporate more information on sustainability. Even in her own teaching experiences, Kirchhoff only provides technical information about chemicals, such as whether a chemical is hazardous or toxic, when to use it in a hood, or where MSDS sheets are located if students want to look at them. “We don’t have a culture of emphasizing green chemistry topics or related topics like toxicology,” she said.
In addition to the lack of educational materials and the overcrowded chemistry curriculum, the perceived lack of rigor of green chemistry and sustainability is another barrier. It is challenging to conduct and teach green chemistry because the easy reactions, which involve the use of hazardous materials at high temperatures and pressures, have already been identified. Inertia is yet another challenge. Today’s educators have not been trained in green chemistry; it was not part of the curriculum when many of today’s working chemists and chemical engineers were in undergraduate and graduate school. It is a challenge to overcome this mindset.
What, then, are the available resources? Paul Anastas and John Warner published their pivotal work, Green Chemistry: Theory and Practice, in 1998. Since then, other green chemistry texts have emerged. Even though Doxsee and Hutchison’s lab manual2 focuses on the organic lab, the information is widely applicable to green chemistry in general, and it is more than just a simple collection of experiments. About a year ago, the Journal of Chemical Education began running a regular feature on topics in green chemistry, with lab experiments and different activities that can be integrated into the curriculum. In addition, Kirchhoff praised the environmental chemistry text by Colin Baird and Mike Cann.3 It provides a breakthrough in terms of integrating green chemistry into mainstream textbooks.
A number of different ACS resources also exist. Introduction to Green Chemistry is specifically designed for high school students and is the most popular of the green chemistry materials that ACS has. Students at the high school level and their teachers appear to be very interested in this topic. In terms of general public outreach, ACS’ Outreach video provides a good introduction to green chemistry. This informational video focuses on the work of three Presidential Green Chemistry Challenge award winners in a way that is accessible and understandable.
Kirchhoff and other speakers noted that chemistry should be considered in the broader context of societal issues. Beyond the Molecular Frontier,4 a National Academies report, specifically emphasized the need for scientists and engineers to understand societal implications in order to enhance stewardship of the planet and recommended a greater emphasis
on the human aspect of the scientific endeavor. ACS also has an ongoing project called Exploring the Molecular Vision, an initiative by the Society Committee on Education to examine the reform of chemical education. Project participants have cited the need for emphasizing toxicity education, highlighting the role of chemistry in supporting the environment, and promoting high ethical standards in environmental performance.
Green chemistry, sustainability, ethics, toxicology, and safety issues are generally absent from the chemistry curriculum at this point. ACS recommends that green chemistry be taught at the high school, undergraduate, and graduate levels in terms of classroom lectures and laboratory training. The Committee on Professional Training (CPT) also stresses interdisciplinary work. CPT guidelines currently emphasize subjects such as economics, marketing, and business within an environmental context, generally pointing out connections between science and society. CPT also has an environmental chemistry option in which the ACS Committee on Environmental Improvement has recommended that green chemistry should be included.
Kirchhoff described different approaches for integrating green chemistry into existing courses and curricula. One method is to develop a whole new course around green chemistry, which has the advantage of providing great depth in the subject. The disadvantage is that a new course is usually treated as an elective and therefore does not reach as many students as a required course does. Nevertheless, it is still an excellent way to introduce students to the real “nuts and bolts” of green chemistry. Another route is to integrate green chemistry into existing courses, both in the classroom and the laboratory. This can be tricky, especially if educators use textbooks that do not include green chemistry, which requires them to be creative in introducing the subject into their courses.
Students should also be encouraged to explore green chemistry on their own. Kirchhoff suggested that, as an alternative to teaching research students the use of established methods, have them instead look in the literature for “greener” tools. There should be an over-arching philosophy of “what are you producing when you do this reaction, what are the by-products,” Kirchhoff said. Educators should consider this even as they teach organic chemistry. Many textbooks do not include the by-products. “There are by-products, and those by-products have consequences,” she said.
Conferences, symposia, and school activities provide great opportunities to educate students about this subject. For example, ACS is organizing its third summer school on green chemistry for graduate students and postdoctoral researchers, to be held at McGill University in Montreal in July. The ACS student affiliates program also recognizes green chemistry chapters at schools. Another often-overlooked opportunity is the incorporation of sustainability into campus building construction and landscaping. St. Olaf College not only received $500,000 from the Keck Foundation to integrate green chemistry into its curriculum, but was also awarded $98,000 from the Kresge Foundation for the design of an environmentally friendly science center. “In the end, what you ideally want is a building that is green, with a program that is green. Tie these two together,” Kirchhoff said.
There are several benefits to incorporating green chemistry into the education curriculum. One is professional preparation: as industry moves toward an increased emphasis on sustainability, it will need graduates who are trained in green chemistry and sustainability issues. Students themselves have an interest in environmental issues and in demonstrating that chemistry and environmental stewardship are not mutually exclusive. On a practical level, sustainability and green chemistry education can increase lab safety and decrease lab waste.
In terms of continuing education opportunities, summer workshops such as the program at the University of Oregon may help faculty members feel comfortable with introducing these topics into their teaching. Many faculty members are uncomfortable because they lack the background in sustainability or green chemistry education and practice. However, continuing education programs may enable them to teach and practice sustainability on their own campuses. Industrial chemists may need similar workshops. Many chemists who currently work in industry also do not have training in green chemistry or sustainability, so they also need to enhance their skills.
One participant pointed out that it might also be important to educate patients and doctors concerning the metabolism of drugs. There is a question of how to educate people to take 100 milligrams of a COX-2 inhibitor, instead of pressing for 400 milligrams, he said.
Mary Kirchhoff emphasized that it is important to capture students’ interest in green chemistry as early as possible, and to show them that chemistry is not the grand polluter of the planet but instead offers solutions to some of the environmental challenges that we face.
DEFINITION OF GREEN CHEMISTRY
One recurrent theme was the search for a term for sustainable chemistry, including discussion of the labels “green chemistry” and “environmental chemistry” and their definitions.
Some participants thought the term “green chemistry” did not do justice to the multidisciplinary and integrative nature of the projects they are working on. Some asked whether the name green chemistry is a detriment to what they are trying to achieve. Mary Kirchhoff said it might be a question of moving “away from the model that, here is chemistry and here is everybody else.” She said there is a need to see that chemistry does not operate in a vacuum, especially at the industrial level.
There was also the point that the label green does not simply mean biologically derived; in fact, it means “stop and think about whether it really is better all the way through the life cycle.” Mary Kirchhoff pointed out that the term biologically derived causes less confusion than the term environmental chemistry. When she reviews applications for green chemistry chapter recognition, many of them list monitoring the pH in a local stream or cleaning up trash as green chemistry activities. “So, they are sort of confusing care of the environment with green chemistry,” Kirchhoff said.
Lauren Heine said it was a great challenge to define what is green. There is no inherently green chemical, she pointed out: “Water, of course, you can drown in it.” Everything is context based, and the material and the metabolism in which it flows have to be considered, she said.
According to Heine, the lack of agreement on what defines a sustainable product or sustainable chemistry makes it hard for companies to market a new product. People are often averse to taking a chance on designing a green material if they don’t know that the definition is going to hold up in the marketplace. “Because if somebody comes out with a different definition, they may well have invested in a green product that is not perceived of as green,” she said.
Heine pointed out that companies should not focus on developing just one or two green chemical products, because it does not demonstrate real commitment to customers, and it does not educate customers. It increases vulnerability at the corporate level if companies only make green that which is profitable. But if a company commits to a corporate-wide strategy of green chemistry, the company will be respected.
One participant said that environmental performance and life cycle costs should not be put into an MSDS sheet, because an MSDS sheet is regulatory and compliance based, and that green chemistry deserves its own sheet.
Berkeley Cue offered the definition of green chemistry that he is most comfortable with: the utilization of a set of principles that reduces or eliminates the use or generation of hazardous substances in the design, manufacture, and application of chemical products. He added that many people think green chemistry is just about organic chemistry. In fact, analytical chemistry, physical chemistry, inorganic chemistry, biochemistry, and all of the other disciplines that interface with chemical synthesis are covered by this definition, Cue said.
He reminded the participants of the 12 principles of green chemistry
as articulated by Paul Anastas and John Warner, the most important being prevention. “It is better to prevent waste than to have to deal with it once you have produced it,” Cue said. He listed atom economy, less hazardous chemical synthesis, safer chemicals, design for energy efficiency, renewable feedstocks, catalysis, and finally, design for degradation. From the pharmaceutical industry perspective, design for degradation is the biggest challenge. The molecules must be stable when they are synthesized, stored, and incorporated in the dosage form. They then should have at least a two-year shelf life with no appreciable degradation. The drug should be stable in the patient who ingests it, because the active form has to reach the site of action. Then, as the drug leaves the patient through biological processes, it would ideally degrade completely into innocuous materials.
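Atom economy, one of the principles Cue listed, has a simple quantitative form: the molecular weight of the desired product divided by the combined molecular weight of all reactants. A minimal sketch of the calculation follows; the reactions and molecular-weight values are invented for illustration and are not from the workshop:

```python
# Atom economy (Trost): the fraction of total reactant mass that ends
# up in the desired product. Molecular weights below (g/mol) are
# illustrative, not data from the workshop.

def atom_economy(product_mw: float, reactant_mws: list[float]) -> float:
    """Percent of total reactant mass incorporated into the product."""
    return 100.0 * product_mw / sum(reactant_mws)

# Hypothetical comparison: an addition reaction, in which every
# reactant atom is retained in the product, versus a substitution
# that expels a heavy leaving group as waste.
addition = atom_economy(product_mw=128.2, reactant_mws=[100.2, 28.0])
substitution = atom_economy(product_mw=72.1, reactant_mws=[100.2, 58.4])

print(f"addition reaction: {addition:.0f}% atom economy")
print(f"substitution reaction: {substitution:.0f}% atom economy")
```

Under this metric the addition reaction scores 100 percent, because no atoms leave as by-products; the substitution scores far lower, which is exactly the “what are the by-products” question Kirchhoff raised in the education discussion.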
There is also a series of 12 green engineering principles that go along with the green chemistry principles. Among them: material and energy inputs and outputs should be as inherently nonhazardous as possible; processes and systems should be designed to maximize mass, energy, space, and time efficiency; embedded entropy and complexity should be viewed as an investment when making design choices; durability, not immortality, should be a design target; and design for unnecessary capacity or capability (“one-size-fits-all” solutions) should be considered a design flaw.
Brad Allenby used CFCs (chlorofluorocarbons) as an example of how the understanding of what is green has changed. CFCs were a classic example of green chemistry because they substituted for fairly toxic, dangerous materials. “If I apply any of the metrics, the heuristics that we tend to use in green chemistry—lower toxicity, lower impact on workers, safer for users, more stable—I would love CFCs,” Allenby said. What made CFCs desirable on one scale—their stability—made them undesirable when they got up into the upper atmosphere and began to break down the ozone layer. He also pointed out that this happened at a very small scale. “If you were looking at volumetric chemical consumption in the United States, you would not have looked at CFCs,” he said. They were a minor trace constituent of the atmosphere and yet, because of the dynamics of the system, they turned out to be extremely critical. “So, a green chemical de-stabilized major earth systems with effects that we are probably not entirely familiar with at this point,” he said.
CFCs show that, at the time, we did not have the ability to think about what was or was not green, Allenby explained. This means there are gaps in the way that chemistry and its impact on global systems are thought about. But Allenby also said the response to CFCs was positive; alternatives were found and used where CFCs had been employed in the electronics sector, such as cleaning circuit boards and piece parts. The other lesson from CFCs, then, is not to go into paralysis mode just because a green chemical may prove harmful some years down the line.
Another example is to consider products as components of networks, Allenby said. None of these products exists by itself. They all tend to be used in a particular context, a network structure, especially something like a telephone. “If you just have a telephone and there are no towers and nobody else has a telephone, then you have got a really kind of interesting paperweight and that is it,” Allenby said. The technology of designing a telephone, and the way its different components and materials work together, is very complex.
One of the mistakes made in policy, teaching, and thinking is assuming that some things are so bad that a ban is immediately needed. There are some cases where that has worked well; lead in gasoline is a classic example.
Allenby recounted moving to corporate intranets at AT&T. This significantly reduced the material demand of the company in terms of paper. But corporate intranets also add to the value of the whole set of material-based products, like telephones with lead in them. “Is that good or bad? Is that green or is that not green? I don’t know,” Allenby argued.
Part of that is the refusal to understand that green chemistry does not operate at the scale of the bench or at the scale of the reaction. It operates at the scale of regional and global systems. “To me, that is an irreducible responsibility of green chemistry and it is one that so far, I think, has not been adequately addressed,” Allenby pointed out.
Allenby said green—not environmental science, but green—is a fairly normative kind of concept. Green chemistry injects the normative into the heart of what has traditionally been a physical science. He reminded the audience of C.P. Snow and his “two cultures” of science and the humanities, which are thrown together in green chemistry. “It is a very, very interesting sociological phenomenon, which is not what I think green chemistry intends to be, but I think it clearly is,” Allenby said.
He went on to point out that, given the scale of human activities, scientists are actually doing earth systems design and engineering. He gave pharmaceuticals as an example. A pharmaceutical is designed to have a specific impact in a specific human system, but at the scale at which any successful pharmaceutical is used, its metabolic products are released into aqueous systems, so it is also designing the aqueous systems of developed countries.
Data from a number of different fields shows this link clearly, although it is not clear from the way pharmaceuticals are thought about, or regulated and taught. “Unless we understand that something that is designed at bench scale will, in fact, in many cases, impact systems at regional and global scale, we have not yet begun to grapple with what is
already occurring in our world; not what is going to occur, what is already occurring,” Allenby said.
Allenby said it is important to learn how to pay more attention to scale, to know that metabolic products are going to end up widely dispersed in the environment. “We should ask if we should be designing not just the pharmaceutical, but also the metabolic product,” he stressed.
Earth systems engineering and management is where the serious business of green chemistry begins, and it is an area that has not received nearly enough focus. “Where is our sense of responsibility? That doesn’t mean simply retreating to ideological structures. Ethics and values need to be comprehensive. We need to learn how to dialogue with these systems, and we need to develop the institutional capability,” he said.
Green chemistry is classic white space: it imports many concepts from social science and, in particular, historically contingent viewpoints into the practice of chemistry, making it very much a white space practice. Unfortunately, Allenby said, we are not really good at dealing with white spaces. A strongly disciplinary scientist or engineer will tend to view white space work as fluffy, an admission by the person doing it that he or she could not handle the discipline. It is awfully hard to get interdisciplinary work funded. The staff at the NSF tends to appreciate the importance of interdisciplinary work, but when the peer review committee regards a multidisciplinary researcher as a complete flake, the process breaks down.
DEMAND AND PRODUCT DESIGN
Lauren Heine gave the participants some background on GreenBlue, and cradle-to-cradle design. She talked about some of the drivers and obstacles for green chemistry, product formulation, and gave some examples of projects that are designed to facilitate adoption of green chemistry.
Lauren Heine talked about the Furniture Flame Retardancy Partnership and the Design for Environment (DfE) Green Formulation Initiative for cleaning products. She also gave an example of a company using green chemistry as a strategy for product development. The Furniture Flame Retardancy Partnership was an EPA project, part of the DfE program.
GreenBlue is about a year and a half old. It is a not-for-profit organization in Charlottesville, Virginia, that was founded by William McDonough, a well-known green architect, and the German chemist Michael Braungart. They wrote a book called Cradle to Cradle: Remaking the Way We Make Things. The book argues that industry and the public can use the following design principles: use current solar income; celebrate diversity of people, products, geographies, cultures, needs, and design; and waste equals food. The waste-equals-food theme is very powerful, Heine explained, as biological materials can be perceived as nutrients flowing within biological metabolisms, and technical materials such as metals or polymers can be perceived as technical nutrients flowing in technical metabolisms.
The value of these materials gives rise to the idea of designing an entire system. A biological metabolism may be the environment in the broader sense, or it may be a wastewater treatment plant. Thought has to be given to designing the biological material so that it can be metabolized. Sometimes the focus is on designing materials for an existing metabolism, and sometimes on designing metabolisms for materials.
In the application of these ideas, the first step is to analyze the chemical composition of materials used, to select materials based on safety to humans and ecological systems, and then to design these materials to be nutrients, for high-value recovery or other beneficial uses. Energy recovery could be considered one of the recoverable values, Heine argued.
Heine talked about the cradle-to-cradle model. There is value in a big-picture mental model like cradle-to-cradle design, she said. While it initially may sound hokey to an engineer, it is powerful because it engages technical and non-technical people alike. Second, it provides a vision, but not a prescriptive approach. Third, the focus on materials and metabolisms points to the importance of systems, and of collaboration with others within the value chain.
GreenBlue sees this as a short-term strategy to move companies and organizations toward sustainability. How well the strategies will work 10 or 20 years from now is not clear, as there may be more important strategies to take.
Heine then named some of the obstacles to integrating green chemistry into product formulation. First, change is always difficult, she said. There are huge manufacturing and market challenges, and a big customer disconnect. Manufacturers will say, we have the brain power, we can make anything, but our customers are not asking for green materials. Or they might say, we make green chemistries, but our customers aren’t buying them. In addition, much of the human and ecological toxicology and life cycle impact data that could support decision making is missing. There is a lot of data out there, but it is not necessarily in a form that can support decisions.
Two examples are material safety data sheets (MSDSs) and technical fact sheets. People look to these to help support decision making, but they are not always very useful. Heine showed an MSDS for a green product that did not list any ingredients. MSDSs are often wrong, they are generally incomplete, and there is no standard format for them. The American National Standards Institute (ANSI) has a very good format that includes environmental attributes, but there is no requirement that everybody use the same MSDS format, and manufacturers only have to report what is hazardous anyway.
Technical fact sheets can be very useful to formulators, because they give an idea as to whether or not it is useful to try this ingredient, based on performance properties. However, they are very inconsistent. Environmental attributes very often are not listed in technical fact sheets, and only the information the manufacturer sees as most relevant can be found. “It would be very helpful, I think, to have environmental attributes consistent, whether they are positive in that case or not, available to people to compare,” Heine said.
Heine talked about the drivers of green chemistry, for example a recent phase-out of penta- and octa-brominated diphenyl ethers (PBDEs) and a pending national regulation requiring flame retardancy in furniture.
First she talked about flame retardants and the DfE-formulated flame retardant partnership. People desire to avoid making the same mistakes as they did with PBDEs. The Furniture Flame Retardancy Partnership was formed with the purpose of providing up-to-date toxicology and environmental information on flame retardant alternatives to the pentabrominated materials used in polyurethane foam and to identify environmentally preferable approaches to designing furniture that meet the pending fire safety regulations.
The pending tasks of the effort include identifying and evaluating existing chemical substitutes for penta-brominated materials; targeting research and data needs; investigating approaches that do not rely on chemical additives, such as barrier technologies, construction techniques, batting fill, and alternative foam formulations; and possibly posting targeted DfE innovation challenges to identify chemical and non-chemical solutions.
Heine then showed some of the information sheets. The sheets allow manufacturers to look at a particular flame retardant chemical and compare it on different end points. For example, one can examine whether the flame retardant is additive or reactive, which affects the exposure potential. The two-and-a-half page matrix is the distillation of about 450 pages of research on the chemical alternatives, done with the EPA and Syracuse Research Corporation.
The sheet allows consumers to choose which attributes are important to them. Acute aquatic toxicity could, for example, be less of a concern, while very low persistence and bioaccumulation potential could instead be the highest priority. “None of the existing options are perfect, but at least this model, I believe, is a really nice way of presenting data to help people make choices,” Heine pointed out.
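The attribute-prioritized comparison Heine described can be sketched in code. The chemical names, endpoint scores, and weights below are entirely hypothetical, invented for illustration rather than taken from the partnership’s actual matrix:

```python
# Hypothetical sketch of choosing among flame-retardant alternatives
# by weighting hazard endpoints, in the spirit of the DfE comparison
# matrix. Scores run 1 (low concern) to 5 (high concern); weights
# express a user's priorities. All values are invented.

alternatives = {
    "chemical_A": {"aquatic_toxicity": 4, "persistence": 1, "bioaccumulation": 2},
    "chemical_B": {"aquatic_toxicity": 2, "persistence": 5, "bioaccumulation": 4},
}

# A buyer who cares most about persistence and bioaccumulation,
# and less about acute aquatic toxicity, as in Heine's example.
weights = {"aquatic_toxicity": 0.2, "persistence": 0.4, "bioaccumulation": 0.4}

def weighted_concern(scores: dict, weights: dict) -> float:
    """Total weighted concern; lower is preferable under these priorities."""
    return sum(weights[k] * v for k, v in scores.items())

ranked = sorted(alternatives,
                key=lambda name: weighted_concern(alternatives[name], weights))
print("least concern under these weights:", ranked[0])
```

The point, as Heine noted, is not that any option is perfect; changing the weights reorders the ranking, which is why presenting the raw endpoint data and letting users set their own priorities is more honest than a single green/not-green verdict.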
Heine then went on to talk about the Green Formulation Initiative for cleaning products. The drivers for green chemistry in product formulation include an executive order for government purchasing called Greening
the Government Through Waste Prevention, Recycling, and Federal Acquisition. The U.S. Green Building Council LEED program has an existing building program that gives points for green cleaning products and for green cleaning programs.
Eco-labels are growing, Heine said. Green Seal has certified about 130 different cleaning products. Canada has the Environmental Choice Program and Europe has its programs.
Heine introduced the U.S. EPA Design for the Environment formulator program, a voluntary partnership between companies and DfE. The companies submit their formulations to the EPA’s technical expert staff for review. This team reviews all ingredients in the formulations and identifies ingredients of concern. The manufacturers are then responsible for finding alternative chemicals and reformulating. If successful, they can use the DfE logo on their product labels. This is a very powerful learning experience for the formulators who engage in this partnership. “They really love this program, and are very proud of the success of the partnership,” Heine said.
GreenBlue has a related project, in part because demand for that service at the EPA far exceeds what the very small program can handle. GreenBlue is creating a resource to promote green chemistry in the design of industrial and institutional cleaning products and to enhance environmental and human health and safety. It is a multi-stakeholder process with 170 or more formulators, raw material suppliers, industry associations, and NGOs working together to establish the relevant attributes and supporting data needed to identify ingredients that can be used to design cleaning products with potential environmental benefits.
GreenBlue is starting with surfactants. Steps include: identifying the attributes, building the database, soliciting ingredients and supporting data, and then posting this information to a public website to promote the greener chemistries. The initial attributes of concern for surfactants will be biodegradability (including consideration of breakdown products), aquatic toxicity, skin irritation, and additional product features such as percent bio-based. The attributes may be of toxicological, regulatory, policy and/or of eco-labeling significance. It will be a fact-based resource for formulators that allows them a one-stop opportunity to select ingredients to enhance the environmental profiles of their products and to communicate ingredient information.
The information will liberate manufacturers. Putting the data out there will allow formulators to consider environmental information when making their choices. Formulators do not want to be told how to use a particular ingredient and whether or not it is “green”. They want to see the information so that they can make their own choices and determine the environmental benefits based on the application, Heine said.
Heine went on to talk about a small formulating company, Coastwide Laboratories, in Oregon that has used a green chemistry strategy in developing their products. They have established positive criteria for product efficacy and environmental health and safety by creating a product development standard that is publicly available on their web site. They thoroughly assess all candidate ingredients to understand their potential human and environmental health impacts. Then they formulate and re-formulate new and existing products to meet the standard. The company has received external verification of their products’ environmental profiles through eco-labels and the DfE Formulator Partnership.
Their full corporate commitment, not just to a few green products but to all product development, has paid off, Heine said. Their sales grew 430 percent from 2003 to 2004. People are asking to license their technology. The company is educating all of its sales, marketing, and technical staff, as well as customers in the community, about the value of green chemistry in its products.
Heine said programs like EPA’s Formulator and Furniture Flame Retardancy partnerships are voluntary programs that are funded at minimal levels. However, they provide a huge educational experience for industries, especially smaller companies.
Heine said she has a small company in mind when she talks about a strategy of green chemistry. “I am not sure how you implement that at a very large company, but I think the principles will hold as well. You need a green chemistry strategy and a corporate goal—to say where you are trying to go,” Heine said.
Berkeley Cue talked about the importance of green technology to chemical enterprises, and the business argument that needs to be made to convince companies to be more active in green chemistry. He narrated some green chemistry success stories, and gave an overview of some technologies that enable green chemistry and engineering. He also talked about some of the barriers to adopting new technologies in green chemistry and engineering.
Cue introduced the concept of the triple bottom line, which was articulated by Elkington in the late 1990s, and is generally regarded as an important consideration in business success. It states that simply focusing on economic success without equally focusing on environmental stewardship and social responsibility is not the right business model for sustainability. “In fact, some people have said that the companies that largely focus on this are simply green washing the issue, and they are not paying attention to the important parameters that they should,” he added.
There is a double economic penalty in sustainable production, Cue said. Only ten percent of the raw materials taken out of the earth actually wind up in goods and services. Ninety percent of them wind up as waste, and re-enter the environment as pollutants. Companies pay to use them and pay to dispose of them.
But even a 10 percent conversion factor would be great for the pharmaceutical industry, Cue said, which has a conversion factor of less than one percent. The industry has always argued that it produces more complex molecules, through more complex syntheses, with lower overall yields, so the low conversion is no surprise. But, Cue argued, it might not have to be that way.
One of the other driving factors is the presence of what Cue called “six billion consumption machines.” If everyone had the same quality of life as the population of the United States or Western Europe, three planet Earths would be necessary to provide sufficient raw materials.
That combination of a ten percent conversion rate for raw materials, and the need for three earths for the emerging countries suggests the need for at least a three-fold improvement in the conversion of raw materials to goods and services. Since the demand grows as the global population grows, “there is a certain amount of urgency involved here to addressing these issues,” Cue said.
Change may have to be the initiative of the companies themselves. For a long time companies have tried to live in a compliance mode; they reacted to regulations as they were promulgated, which was feasible until about the 1960s, Cue said. But any company that believes it can exist by simply trying to comply with regulations, including regulations that have yet to be promulgated, is probably following the wrong business model, Cue said. This compliance costs U.S. industries over $200 billion, money that could be better spent investing in R&D to discover new and more valuable products, Cue pointed out.
Responsible Care is another important business driver for adopting green chemistry and engineering principles. It is a binding obligation of the chemical industry: self-responsibility in the areas of health, safety, and the environment.
Another important driver is REACH (Registration, Evaluation and Authorisation of Chemicals) and the European regulations that are being introduced. The European Union, with its 450 million citizens, is now a larger economic market than the United States, and those regulations will have an impact on the United States if the U.S. is not more proactive in terms of
addressing some of the shortcomings. But companies may also gain through REACH. “One of the things our customers are going to learn, as they work through REACH, is that the gain from going from 70,000 chemicals on the chemical inventory, to 15,000 chemicals on the chemical inventory is going to be an enormous gain to them,” John Carberry said. The regulatory process is going to drive chemical simplification toward chemicals that are known to be safe, which is a big switch from chemicals that are not known to be hazardous.
Another reason to promote green chemistry is homeland security as this would reduce hazardous waste that could be used by terrorists as a weapon. “A rallying cry would be terrorists hijacking a tanker truck with 8,000 gallons of hazardous waste, driving it to Times Square, and kicking open the valve,” Cue said.
Waste reduction could be a big incentive for green chemistry in the pharmaceutical industry. The pharmaceutical industry pointed to the petrochemical industry for a long time, and said: “We don’t make large quantities of material, relative to those guys.” However, it does produce an awful lot of waste overall. Cue calculated that the pharmaceutical industry could be producing somewhere between half a billion and 2.5 billion kilos of waste, given the many kilos of waste generated for every kilo of active drug produced.
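The scale Cue described follows from simple E-factor arithmetic (kilograms of waste generated per kilogram of product, Sheldon's metric). The sketch below is illustrative only: the E-factor range and the annual output figure are assumptions, not numbers from the talk.

```python
# E-factor = kg of waste generated per kg of product.

def waste_total(e_factor: float, annual_product_kg: float) -> float:
    """Total annual waste (kg) implied by a given E-factor and output."""
    return e_factor * annual_product_kg

# Assumed inputs (illustrative): E-factors of roughly 25-125 kg waste per kg
# of active ingredient, and industry-wide output of ~20 million kg per year.
annual_api_kg = 20e6
low = waste_total(25, annual_api_kg)
high = waste_total(125, annual_api_kg)
print(f"{low:.1e} to {high:.1e} kg of waste per year")  # 5.0e+08 to 2.5e+09
```

Even under these rough assumptions, the total lands in the half-billion-to-billions range Cue cited, which is why a 10-fold E-factor reduction matters so much.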
This amount can be reduced, Cue argued. By applying green chemistry principles companies could see a dramatic reduction in waste, on the order of 10-fold. Cue lauded Glaxo for doing a very nice job in life cycle analysis, which not many pharmaceutical companies do. Cue then called for one of the outcomes of the conference to be an increase in cross-talk among the pharmaceutical, oil, fine chemical, and bulk chemical industries, in order to share best practices and to learn how each addresses these important issues.
Some of the success stories of green chemistry are legislative ones. Cue described a bill pending last year designed to integrate the federal government’s approaches to green chemistry across the EPA, the Department of Energy, the National Institutes of Health, and NIST. It passed the House of Representatives by a vote of over 400 yeas to 14 nays. It was sponsored by Senators Snowe and Rockefeller in the Senate but did not come to a vote before the last Congress expired, so it must go through again.
Cue also called the Presidential Green Chemistry Challenge Awards a great tool for encouraging more green chemistry and engineering in the chemical enterprise. He recounted the experience of working for a company that won one. “It is one of the most motivational things that you can ever experience, and will really drive the entire company to look for other examples of success stories to submit for applications,” he said. There are five award categories: small business, academic investigator, alternate synthetic pathway, alternate reaction conditions, and design of
safer chemicals. In the pharmaceutical industry, Lilly, Roche, Pfizer, and Bristol received awards. It is important that the pharmaceutical industry be very active in these areas because their mission is to bring innovative health care solutions to patients, but if that happens at the expense of a healthy environment, that is an incomplete mission, Cue said.
The award for Bristol, won in collaboration with Phyton, was given for an improvement in the manufacturing of paclitaxel: plant cell fermentation using renewable nutrients such as sugars, amino acids, vitamins, and trace elements. The company ferments paclitaxel directly, without the need to stockpile yew needles. Overall, they have eliminated 10 solvents and six drying steps.
Some of the enablers of greener manufacturing processes will be biotransformations, process analytical technology, robotics and automation, crystal engineering, separations technology, green solvents, microwave chemistry, equipment cleaning, and bio-based raw materials.
As an example, Cue talked about the work of Codexis, a biotransformation company in California. The company isolates genes that encode natural enzymes, which facilitate biotransformations. They then cut the genes up with DNA shuffling, generate a library of novel genes by recombining the fragments at random, screen the library for yield improvement, and find novel genes with improved properties. The president of Codexis claims that if the yield can be detected, they can develop a manufacturing process with a high yield.
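The shuffle-and-screen loop described above can be caricatured in a few lines of code: segments from parent genes are recombined at random, and the resulting library is screened for the best measured property. Everything here (the segment labels, the toy yield function, the library size) is invented for illustration and is not Codexis's actual method.

```python
import random

random.seed(0)

def shuffle_genes(parents, n_offspring, n_segments):
    """Build a library by drawing each segment at random from one parent
    (a toy stand-in for DNA shuffling's random recombination)."""
    return [tuple(random.choice(parents)[i] for i in range(n_segments))
            for _ in range(n_offspring)]

def screen(library, measure):
    """Screening step: keep the variant with the best measured property."""
    return max(library, key=measure)

# Two hypothetical parent genes, each split into 4 labeled segments.
parent_a = ("A1", "A2", "A3", "A4")
parent_b = ("B1", "B2", "B3", "B4")

# Pretend that parent B's segments 2 and 3 each improve the yield.
def toy_yield(gene):
    return sum(seg in ("B2", "B3") for seg in gene)

library = shuffle_genes([parent_a, parent_b], n_offspring=200, n_segments=4)
best = screen(library, toy_yield)
print(best, toy_yield(best))  # the best recombinant carries both B2 and B3
```

The point of the sketch is the structure of the search, not the biology: random recombination generates diversity cheaply, and a detectable assay (the `measure` function) is all that is needed to pull improved variants out of the library.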
The basic building block of cephalosporin antibiotics—7-ADCA—was made this way. It starts with penicillin G, which is made by fermentation using phenylacetic acid as a feedstock. Using the proprietary technology, they found a more streamlined conversion. In terms of yield efficiency and in vitro activity, which are very low with the traditional process, there is about a 40-fold improvement in overall yield in the engineered process.
There is a need to look for potentially relevant technologies outside of the chemical industry, Cue said. For example, Foster-Miller is a company that manufactures a military robot to investigate potentially hazardous materials, and the same technology allows them to design lab automation for the pharmaceutical industry for a totally automated, analytical laboratory.
One of the challenges is the science of scale. One goal is to go right from the lab to the 4,000-gallon reactor. Another way is to run a whole bunch of small reactors, which is what Velocys does with its microchannel reactor approach. It has some advantages, including high yield and selectivity, and it eliminates the need for catalyst recovery, because the catalyst is embedded. How much you need to produce determines how many modules you need, and at the individual reactor level the modules are small and inexpensive.
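The "numbering up" logic behind microchannel reactors reduces to integer arithmetic: capacity grows by adding identical modules rather than by enlarging a vessel. The module capacity below is a hypothetical figure chosen only to show the scaling rule.

```python
import math

def modules_needed(annual_demand_kg: float, module_capacity_kg: float) -> int:
    """Number of identical microreactor modules required to meet demand,
    rounding up since a fractional module cannot be installed."""
    return math.ceil(annual_demand_kg / module_capacity_kg)

# Assume (illustratively) each module produces 4,000 kg/year:
print(modules_needed(50_000, 4_000))   # 13
print(modules_needed(500_000, 4_000))  # 125
```

A ten-fold jump in demand is met by installing ten times as many of the same proven module, which is exactly why each individual reactor being small and inexpensive matters.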
Another important technology is product purification and particle size control through impinging jet crystallization. A synthesized molecule has to be recovered in a pure state, which is a process that is energy-, solvent- and waste-intensive. The pharmaceutical industry not only needs the crystalline form but also a precise particle size. In impinging jet crystallization, there are two streams of solvent. One stream carries the material to be crystallized, the other is an anti-solvent. The streams are slammed together, and the concentration and velocity determine the particle size.
Simulated moving bed, or multiple column, chromatography is another area that is getting a lot of attention in pharma. Costs have come down for the equipment and the column material, solvent use is substantially reduced, and the FDA is much more comfortable with the technology now. Companies are using it to produce commercial quantities of material and in the early stages of clinical supply synthesis. There are many predictions that drug companies will need this technology more and more in the future.
Microwave chemistry is another area of great interest. Right now the technology is still at the laboratory stage, because of a bias that it will not be engineerable and scalable.
Solvents are getting a lot of attention in green chemistry and engineering. Ionic liquids, for example, are seeing tremendous growth in the number of publications. Near-critical and supercritical solvents, for which Liotta and Eckert won a green chemistry award last year, are very interesting. With proper engineering technology, those temperature and pressure ranges are not impossible to operate at, and some new chemistry that will lead to new products may be uncovered.
Equipment cleaning is a further important field, as a large amount of solvent or water waste is generated in cleaning chemical and manufacturing equipment. Right now, spray balls are kind of the gold standard, at least in the pharmaceutical industry. Some companies have looked at using ultrasonics, but the problem is that the detergent used is abrasive to the glass that lines many of the reactors.
The metrics of green chemistry are another big challenge, Cue said. Agreement on the metrics, their definitions, the measurables, and how to measure them would be important. The drug industry has a bad track record of incorporating new technology, and this could actually work against green chemistry: people could assume that because it has not been successful up until now, a new technology will not succeed either.
LIFE CYCLE ANALYSIS
Richard Helling talked about some of the drivers for sustainability metrics in general and life cycle analysis in particular, and some of the
key challenges revolving around data quality and availability, the art of impact assessment and how it is done from a methodology point of view. “In order to change something, you have to be able to measure it, and you have to be able to do it quantitatively, and life cycle analysis gives us one way of doing that,” Helling said.
Helling first explained which characteristics make a good metric. A good metric needs to be scalable, understandable, reproducible, and able to describe all three dimensions of sustainability. For economic metrics, there are a number of well-established descriptions for performance and accounting standards. A recent innovation is what is called total cost accounting, which looks to the future to quantify some of the noneconomic factors and convert them into dollar terms, so that a comparison based on a common dimension, dollars, is possible. “It is great in principle, a little hard to do in practice, because there is a lot of subjectivity and forward looking uncertainty with that,” Helling said.
Life cycle analysis is still not well established, and there is not a globally agreed-upon set of metrics. In general, LCA starts by looking at the intensity of energy use or mass use or pollutant emissions, such as the kilograms of CO2 per kilogram of product, or by looking at the economic side—the dollar value of production per megajoule of fossil fuel use—as an eco-efficiency. These values can be quantified in life cycle analysis.
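The intensity and eco-efficiency quantities described above are simple ratios. The sketch below only illustrates the definitions; the plant figures are made up, not data from the talk.

```python
def emission_intensity(co2_kg: float, product_kg: float) -> float:
    """Pollutant intensity: kg of CO2 emitted per kg of product."""
    return co2_kg / product_kg

def eco_efficiency(value_usd: float, fossil_energy_mj: float) -> float:
    """Eco-efficiency: dollar value of production per MJ of fossil fuel used."""
    return value_usd / fossil_energy_mj

# Hypothetical plant figures, purely to show the definitions:
print(emission_intensity(3_000, 1_000))    # 3.0 kg CO2 per kg product
print(eco_efficiency(250_000, 1_000_000))  # 0.25 dollars per MJ
```

Note the two metrics run in opposite directions: intensity is better when smaller, eco-efficiency when larger, which is one reason a globally agreed-upon metric set has been slow to emerge.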
Sociometrics is another relatively new area of research. While there is less consensus on what makes a good sociometric, it is clear that, when something is measured, people’s behavior can be changed. A clear example is industry’s tracking of workplace injuries, which have been reduced by an order of magnitude over the years.
Many metrics have been developed for nations, or for looking at the status of the planet, rather than for individual companies or projects. But it is unclear if there is a need to find the best set of metrics for the chemical industry or to find separate and distinctive measurements on the social dimensions for each case.
Life cycle analysis is an analytical tool for the systematic evaluation of environmental aspects of a product or service system through all stages of its life cycle. Importantly, it is quantitative and looks at the full life cycle, so it does not just move waste or problems from one stage to another. It is a comparative tool, with standards from the International Organization for Standardization (ISO). The framework of classic life cycle assessment from the ISO standards defines four phases in the process: goal and scope definition, inventory analysis, impact assessment, and interpretation.
Impact assessment asks how the product and its emissions relate to impacts such as mortality, and interpretation asks what to do with the data. “The how is very well defined by these standards. The what, though,
is really left to the particular study that you are trying to do,” Helling said.
But in looking at LCA results, one has to consider the very specific question that the authors are trying to answer, because they may not be trying to answer the exact same question that you are interested in. The inventory calculations need a tremendous amount of data or time, or usually both. Then, in the impact assessment there are a lot of different choices to be made of how to relate emissions on a mass basis to impacts, either midpoint on global warming potential or on mortality, for example.
The first challenge for a new LCA practitioner is picking the right software tool. There are many tools available; universities, national labs, and companies have developed their own versions, which vary in their specific strengths, weaknesses, and costs. BEES and TRACI are publicly available on the web. OMNITOX is being created in Europe and should be coming out soon. But software that is not self-developed carries a risk: a certain amount of faith in, or understanding of, how it works is needed. While it is very easy to generate numbers, it is more difficult to understand what those numbers mean.
Helling presented an example of life cycle assessment, looking at a flexible foam polyol, such as might be used in seat cushions in cars. It concerned the possibility of using a soy-based material and comparing that to petrochemical alternatives.
A key first step is to define the functional unit: the level of performance that will be compared and the different options for meeting it. In this case, it was very important for the product to have certain mechanical properties. The soy-based route was not 100 percent soy based; a certain amount of petrochemically derived ethylene oxide was required to achieve product properties identical to those currently used.
The study was done as a cradle-to-gate assessment, meaning that the end use, fate, and disposal of the product were assumed to be the same regardless of the route. The impact assessment was limited; some impacts were carried only to midpoints, such as global warming potential, and no health or toxicity impacts were estimated.
The basis for making polyols in Europe was used as the reference case for the petrochemical route. This is based on a survey of 12 different sites using three different technologies in 1998 in four countries. Another route based on patents published by BASF, using hydrogen peroxide as an oxidant instead of chlorine, was also used.
A typical result from a life cycle analysis might show the megajoules of gross energy consumed per kilogram of product. The 1995 data, the BASF patent information, and the two different farming models for the
soy polyol were available. In summary, the soy polyols gave a very exciting opportunity to reduce the total energy of the product to about 65 to 70 percent of what it is today. In looking at just the fossil feedstock component, they require only 40 to 45 percent of the energy intensity, which is better than a lot of potential biomaterials.
The BASF route, by not using chlorine, decreased mass intensity and, more important, freed this route from chlorine infrastructure. The use of chlorine can be very economic if the infrastructure is in place but, if not, it is rather expensive, and there are concerns about dealing with chlorinated by-products.
There was a big difference between the greenhouse gas emissions from the two different farming models for making soy-based polyols. This is due to the different assumptions of the generation of nitrous oxide from the use of fertilizers in the farm. The two different groups took two different approaches. They both looked at the same data but came to different conclusions, and this has a big impact on the overall view of the process.
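The sensitivity described above can be made concrete: hold the fertilizer data fixed and vary only the assumed fraction of applied nitrogen emitted as nitrous oxide, and the carbon footprint changes dramatically. All numbers below are illustrative assumptions, not values from the study Helling presented; the N2O global warming potential is the approximate IPCC 100-year value.

```python
GWP_N2O = 298.0  # approx. 100-year global warming potential of N2O (IPCC)

def farm_ghg(fertilizer_n_kg: float, n2o_emission_factor: float,
             other_co2e_kg: float) -> float:
    """CO2-equivalent emissions: N2O from fertilized soil plus everything else.
    N applied * emission factor gives N2O-N; the 44/28 ratio converts
    nitrogen mass to N2O mass."""
    n2o_kg = fertilizer_n_kg * n2o_emission_factor * (44.0 / 28.0)
    return other_co2e_kg + n2o_kg * GWP_N2O

# Two groups, same fertilizer data, different assumed N2O loss fractions:
low = farm_ghg(fertilizer_n_kg=100, n2o_emission_factor=0.01, other_co2e_kg=500)
high = farm_ghg(fertilizer_n_kg=100, n2o_emission_factor=0.05, other_co2e_kg=500)
print(low, high)  # the assumption alone roughly triples the footprint
```

Because N2O is hundreds of times more potent than CO2 per kilogram, a single contested assumption about fertilizer chemistry can dominate the overall comparison between routes, which is exactly the disagreement the two farming models produced.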
Helling then talked about some of the challenges facing the chemical industry. There is an opportunity to gather and share information more effectively on the broad scope of the chemical industry. A lot of databases have been set up for specific applications, like in plastics for building materials, but not necessarily for the breadth of the chemical industry. Sharing information really started with the APME work and on the eco–profiles from Europe. It is continuing now with the American Plastics Council work in order to do the same thing as part of the U.S. life cycle database initiative being led by DOE and NREL, and Dow is fully contributing a lot of data for that effort. There will probably still be segments of the chemical industry that aren’t covered by that because the focus is plastics, though. The need for access to data is also recognized by the UNEP SETAC life cycle initiative, which is another recently launched initiative.
In the area of impact assessment, a consistent procedure is needed. This means the most appropriate impact assessment methodology for chemical processes should be found. Helling credited BASF, which has done a superb job advocating for eco-efficiency analysis.
Some of the participants wanted to know why the tools come from a negative perspective. They were interested in balancing costs or greater negatives with greater benefits. They were looking for a way to add a benefits analysis along with cost assessment. Helling said it was a matter of definition. “Do you want zero to be good or infinite to be good?” he asked. Terry Collins said he was not aware of a single case where a pollutant put out in the environment has had a good effect, the inverse of toxicity or eco-toxicity. He added the damage that was done to American society by lead in paint and gasoline is incomprehensible, and it will never be
quantified, because there will never be the experiment of having a civilization without lead.
Richard Scott asked if LCA took the scale of technologies into account when they are adopted on a widespread basis. Richard Helling answered that any time Dow looks at a new product, they look at its unit cost, and at total capital cost to get into that business, or the total market size, which will be unit cost times some hopefully very large number. The same approach is taken to look at the LCA results.
Robert Kavlock talked about computational toxicology, which uses the best of modern chemistry, information technology, and biology to do a better job of assessing risk. He talked about how the EPA views this research strategy and presented some of the applications the EPA is pursuing.
The EPA views the world through a source-to-outcome paradigm, Kavlock said. A release takes place into the environment and undergoes transformation, followed by an exposure, which is the contact of a chemical with an organism. That is translated into an internal dose, which triggers a biological event and eventually an adverse outcome, which the EPA regulates.
That has been the driving force in coming up with a computational toxicology program, Kavlock said. There is a lot of data for some kinds of chemicals. For example, the EPA asked the regulated industry for about $19 million in studies for pesticides, but there are other cases in which the EPA does not have legislative authority to ask for data.
Congress constantly gives the EPA new lists of chemicals to worry about, like endocrine disrupting chemicals, pesticidal inerts or high production volume chemicals, Kavlock said. The EPA has no way to sort through these and look at the risk-based criteria for setting testing priorities. So it really needs a different way of evaluating the way it approaches prioritization, screening, and testing.
Kavlock gave one example of the problems EPA faces. This has to do with so-called pesticidal inerts, or nonactive ingredients. These substances are not necessarily chemically or biologically inert, but they are not the active ingredients. The EPA issues what are called tolerance exemptions for these. To do that, the agency conducts a risk assessment, for which there are no data requirements. “Basically, what we wind up doing is a margin-of-exposure estimate, where we calculate what the likely human exposure is going to be,” Kavlock said. This is done through literature research and toxicity estimates. The burden of proof is that there is a reasonable certainty of no harm. The problem is that there are 850 pesticidal inerts in use, there are no testing requirements, and the agency has been told to finish this work by August 2006.
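The margin-of-exposure estimate Kavlock described is a simple ratio of a no-effect dose to an estimated human exposure. The sketch below uses hypothetical numbers for an unnamed inert ingredient; the rule-of-thumb threshold in the comment is a common convention, not something stated in the talk.

```python
def margin_of_exposure(no_effect_level_mg_kg_day: float,
                       estimated_exposure_mg_kg_day: float) -> float:
    """Margin of exposure: ratio of the dose showing no observed adverse
    effect to the estimated human exposure. Larger is more reassuring."""
    return no_effect_level_mg_kg_day / estimated_exposure_mg_kg_day

# Hypothetical inert: a NOAEL of 10 mg/kg/day taken from the literature,
# and an estimated exposure of 0.001 mg/kg/day from use patterns.
moe = margin_of_exposure(10.0, 0.001)
print(moe)  # 10000.0
# A common rule of thumb treats an MOE above roughly 100-1,000 as low
# concern, depending on the uncertainty factors applied.
```

The weakness Kavlock pointed to is visible in the inputs: with no testing requirements, both numbers come from literature searches and estimates rather than from data generated for the chemical in question.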
This list goes on, Kavlock said. There are 350 non-food use antimicrobial agents in the same category, and 3,000 food use inerts. Because of that, the EPA four years ago began to think about how it could approach doing a better job of looking at these kinds of hazards.
Now, finally, this year the National Center for Computational Toxicology will open, Kavlock announced. It will have a small group of systems modelers, computational chemists and bioinformaticians. Concordant with this, there is a request for applications for a center for environmental bioinformatics, which will cost $1 million a year for the next five years, to help sort through the enormous amounts of data that this program will generate.
The EPA is planning to use computational chemistry tools and proteomic, genomic, or metabonomic technologies and apply them to risk assessment. This will aid the agency in screening and prioritizing chemicals through an understanding of the toxicity pathways with which these chemicals interact. In five or ten years the agency hopes to have faster and more accurate risk assessments.
There are two objectives in the framework, Kavlock pointed out. The first is to improve the linkages in the source-to-outcome paradigm. The agency has been good at looking at individual steps of that paradigm, but not at predicting what will happen at the next stage, Kavlock explained. There could be fate-and-transport models. There could be physiologically-based pharmacokinetic models. There could be biologically-based dose-response models, or a version of systems biology models, which basically show how a chemical interacts with the normal biology. The second objective is to provide predictive models for hazard identification, which involves quantitative structure-activity determinations. The final step is enhancing quantitative risk assessments. These technologies can cross levels of biological organization. The goal is to find out how looking at the cell level can inform the organism or population level, Kavlock said. Other questions are how to translate from high doses to low doses and how to extrapolate across species.
The concern is not just with human health, but also with wildlife health. The EPA is using some molecular biology tools to sequence, express, and clone estrogen and androgen receptors from a variety of ecologically relevant species. Chemicals are then tested against them to see the similarity with estrogen or androgen receptor binding.
Kavlock talked about the National Center for Computational Toxicology. It will have about 20 people when fully staffed this year and will function as a sort of think tank for the agency. “The center will have a strong emphasis of developing partnerships, because in the era of the U.S. government, we have to stretch our resources as far as you can go, and there are a lot of other organizations out there that have very similar motivations,” Kavlock said.
Likely focus areas are information technology and prioritization. “How do we get into that list of 850 pesticidal inerts and tell the agency, these 63 are the ones you should worry about, and you should worry about 21 of them for birth defects and 32 of them for cancer effects, and be able to put some kind of a priori knowledge in the system?” Kavlock explained. Quantitative risk assessment work is focused primarily on physiologically-based pharmacokinetic models and how they can be used more routinely in risk assessment.
The center has already developed some partnerships. The Department of Energy has helped with some of the genomic work. For instance, there is no gene chip available for the fathead minnow. The Department of Energy has helped sequence part of the fathead minnow, so that the EPA can develop microarray chips and be able to do some of the same kinds of studies in wildlife species as you can do now with rodents.
Kavlock also talked about some of the other partnerships. The Department of Defense has several ongoing efforts, such as a new program in eco-toxicogenomics. The EPA is also working closely with the National Center for Toxicogenomics and the National Toxicology Program to apply genomics and prioritization techniques.
One example is the work with hormone activity. In 1996, Congress passed the Food Quality Protection Act, which stated that the EPA needed to screen chemicals for estrogenic activity. The agency employed an expert panel that came back with a number of screening assays, based upon whole animals, for looking at estrogens and androgens as well as thyroid hormones. The recommended assays together cost $250,000 to $300,000 per chemical, with the number of chemicals to be screened in the range of 1,000 to 1,500.
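The scale of that screening cost is easy to see with a rough calculation; the per-chemical costs and chemical counts are the figures quoted above, and the multiplication is ours:

```python
# Rough cost check (our arithmetic, using the workshop's quoted figures):
# $250,000-$300,000 per chemical, for 1,000-1,500 chemicals.
cost_per_chemical = (250_000, 300_000)  # dollars, low and high estimate
n_chemicals = (1_000, 1_500)            # chemicals to be screened

low = cost_per_chemical[0] * n_chemicals[0]
high = cost_per_chemical[1] * n_chemicals[1]
print(f"Total screening cost: ${low / 1e6:.0f}M to ${high / 1e6:.0f}M")
```

The whole-animal battery thus implies a program on the order of hundreds of millions of dollars, which motivates the in vitro and in silico shortcuts discussed next.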
The EPA has tried from the start of the program to use in vitro studies to short-circuit some of the testing, and in silico studies to avoid the use of tissues altogether. Kavlock said such an assay might eventually be recommended as a replacement for an animal test.
The agency is also pursuing a number of activities in quantitative risk assessment. Basically, it is developing a three-dimensional plot of the TSCA (Toxic Substances Control Act) inventory. The challenge is to predict the binding of chemicals that lie outside the well-characterized regions of that space. One of the challenges is simply to understand what that chemical space is, and then to develop strategies for sampling chemicals in the farther domains so that more robust and quantitative structure-activity models can be developed.
Another of the agency’s approaches is completely in silico. One of the EPA’s researchers is collecting crystallized nuclear receptors for androgens, estrogens, and thyroid hormones. The scientist extracts the ligand computationally. Using computational models, he can place other chemicals in the receptor and see what the dynamics are and how they fit. About 40 chemicals have been studied across a number of different mutant receptors with this approach, and the predicted binding correlates very well.
But there is more to do, as it is not good enough to look only at the parent chemicals, Kavlock pointed out. Even for simple chemicals like bromobenzene, the metabolism can get fairly complicated, and the metabolites could be causing the toxicity. One of the projects in the EPA’s Athens laboratory is a metabolic simulator, which predicts likely metabolites, gives the probability that they will be present, and indicates whether they are terminal metabolites. Another display shows the metabolic profile, the probability of the different metabolites being formed, and this can be made into a QSAR model. Then it will be possible to say: “Well, it is the metabolites that we think are going to be more active or less active.” It is a more comprehensive approach to structure-activity relationships, Kavlock explained.
Kavlock then talked about some of the computer databases that are useful for computational toxicology. ECOTOX, for example, is not a product of the computational toxicology program, but it holds information sources on ecological effects. It has over 500,000 scientific records covering several thousand species and close to 10,000 chemicals, and it is available on the web. “You can go there, you can query your chemical, and you can see all the known literature that is available on these chemicals,” Kavlock said. It is a peer-reviewed database with very strict standards for data acceptance.
Another project is called DSSTox, the Distributed Structure-Searchable Toxicity database. It captures available public toxicity databases, cleans them up, annotates them for chemical structure, and then provides them in standard-format data files.
Kavlock then talked about a toxicogenomics database that deals with a group of chemicals called conazoles, a class of fungicides with which the pesticide office has been concerned. Although the substances are all conazoles, they have very different toxicity profiles: some are liver carcinogens in mice; others are testicular toxicants in rats. To determine why these chemicals cause different profiles, the EPA is trying to use a combination of genomics, proteomics, and metabonomics.
Using acute genomic expression profiles, the agency hopes to sort through chemical toxicity and look for toxicity pathways more efficiently. Two companies actually take the same kind of approach with much bigger databases; one is Miconics, which has now looked at almost 600 chemicals and has come up with genomic fingerprints that it states are predictive of chronic health effects from acute exposure. Kavlock ended by saying toxicology and green chemistry have a lot in common. “Have you made one monster from another, have you traded one devil for another devil?” he said.
CURRENT FOSSIL FUEL DEPENDENCE AND FUTURE ALTERNATIVE FEEDSTOCKS
Stanley Bull first talked about the history of feedstocks. Historically, feedstocks were dominated by wood. “We were in the renewable feedstock business once upon a time, but then we discovered things [such as] oil and natural gas,” Bull said. Nuclear, hydro-, and some non-hydro renewables soon followed, Bull said.
Bull stated that oil is in many ways the biggest challenge. The countries holding the largest oil reserves are not necessarily friends of the United States. At the same time, the U.S. has a large appetite for oil.
Despite that, Bull said, there is good news. World energy intensity is getting better: it has been improving at one percent a year, and in the United States at double that rate. In other words, energy efficiency or energy conservation has grown at about two percent per year. Still, this is not adequate to keep up with our energy demand. The energy goes to several different end users. The end-use sectors are industry, buildings, and transportation, each of which uses roughly a third of the total energy. A certain fraction, about 32 percent, is delivered by way of electricity. The feedstocks are petroleum, followed by coal, natural gas, and nuclear.
Further good news is that renewable energy is on the map, Bull said. It breaks down primarily to about 45 percent hydroelectric and a bit more than that in biomass. In previous years hydroelectric held the number one spot, but it is losing ground, primarily because of continuing drought in the West.
The two fastest growing renewable energy sources are wind and solar, even though they are a small part of the total at the moment. The key drivers for renewable energy and energy efficiency are energy security, climate change, air emissions, and electricity reliability.
Public acceptance of renewable energy is broad, but public understanding of it is not. Solar is tangible. “I think they are starting to get the idea of wind, because wind is growing rapidly,” Bull said.
Bull then talked about the role the five renewable sources—solar, wind, geothermal, hydroelectric, and various kinds of biomass—and efficiency can play. Efficiency should come first, Bull pointed out, as the best energy is energy not used. Efficiency can and should be a factor in all three end-use sectors, Bull stated. Wind is primarily an electricity-producing technology, but it can also be viewed as a link to the end-use sector through hydrogen. Solar and biomass can also go through the hydrogen route. Solar is likely to be more distributed, whereas wind can be deployed and utilized in larger quantities. Virtually all of the renewables go through the electricity route to the end-use sector. Biomass electricity is another approach, and biomass is also useful as a renewable fuel, which adds to its appeal.
Bull named some of the challenges facing feedstocks. One is the anthropogenic production of carbon dioxide, which has accelerated in modern times; the second is the continuing release of carbon dioxide into the atmosphere in the future. “Depending on your perspective, I believe most scientists believe we need to be doing something about this,” Bull said. The population challenge is another important factor.
Bull then went on to talk about biomass. The public doesn’t understand what the term means, he said. In addition, an enormous amount of biomass is still put in landfills. “Cardboard, waste paper, grass clippings, broken branches, we put a ton of things in the landfill that I am embarrassed,” Bull said. Woodchips and forest thinnings are another important biomass resource, as are agricultural crop residues. Corn stover is one program area the Department of Energy is currently looking at; its collection could be combined with the harvesting of corn. Then there are energy crops of various kinds, like switch grass, poplars, and alfalfa. “Ultimately, once you use the waste, then you would think about migrating to the use of energy crops,” Bull said.
With the crops at hand, the next step is to think about the equivalent of petroleum refineries, Bull pointed out. Not all biomass is equal; there are various forms of biomass in terms of chemical constituents. Furthermore, every piece of biomass needs to be utilized to make biomass an economic business, and it is expensive to bring it to a processing plant. “We need to be doing the same thing [as] the petroleum folks, and that is get every value … of every product possible out of it,” Bull stressed. The Department of Energy’s program is now focused on this point.
Biomass is the only renewable source that produces carbon-based fuels and chemicals. Non-cellulosic biomass traditionally means starch from corn, wheat, or other crops. Sugar complexes that are easy to hydrolyze are currently used for ethanol. Oils, such as soy or corn oil, are also relatively easy to process and deal with, which leads to their current use as biodiesel; biodiesel benefits from the fact that a significant element of the current fuel supply is already diesel.
Transitioning to biomass fuels is another challenge. The transition will require either a change in engine type or a change in the fuel supply.
Then there is non-cellulosic biomass with proteins that can be used for soybean meal and other chemicals and materials. Byproducts from corn meal, for example, are animal feed and other food products. The processing of starch is straightforward: it is hydrolyzed to glucose, which is then fermented to ethanol.
For cellulosics the constituents are both an opportunity and an intense challenge, Bull said. There are three main constituents. The first is lignin, which makes up 15 to 25 percent of the biomass. Lignin contains aromatics, but since they are oxygenated, they may not be easily utilized; current thinking is to simply use lignin as a fuel, or to gasify or pyrolyze it. Ultimately, though, it should have greater value as a chemical constituent. Hemi-cellulose is another component, making up 23 to 32 percent; it consists primarily of five- and six-carbon sugar polymers and is easier to hydrolyze. Six-carbon sugars are easy to deal with, but five-carbon sugars are a greater challenge. The cellulose component is up to 50 percent. It is a polymer of glucose and is generally hydrolyzed using enzymes, after which it is easy to ferment to ethanol. One participant pointed out that a number of technologies needed to process lignin or cellulose do not exist, and that there is a big need for R&D.
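The constituent ranges above can be tallied in a small sketch. The lower bound assumed for cellulose (35 percent) is our assumption, since the text gives only "up to 50 percent":

```python
# Bookkeeping of cellulosic biomass constituents (percent of dry mass).
# Ranges from the text; the cellulose lower bound is an assumed value.
constituents = {
    "lignin": (15, 25),
    "hemicellulose": (23, 32),
    "cellulose": (35, 50),  # "up to 50 percent"; low end assumed
}
midpoints = {name: sum(rng) / 2 for name, rng in constituents.items()}

# Hemicellulose and cellulose are the sugar polymers that can be
# hydrolyzed and fermented; lignin is not fermentable.
fermentable = midpoints["hemicellulose"] + midpoints["cellulose"]
print(f"Sugar-polymer fraction (midpoints): about {fermentable:.0f}% of dry mass")
```

Under these assumed midpoints, roughly two-thirds or more of the dry mass is potentially fermentable sugar polymer, which is why the lignin fraction dominates the remaining processing challenge.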
But fuels cannot be the only focus of these processes. The Department of Energy’s programs primarily concentrate on ethanol, electricity, and possibly heat. For this to be an economically viable operation, products in the form of chemicals, materials, food, feed, and fiber also have to be developed.
Bull named some examples of producing by-products. There is a partnership with Dupont that is working on an integrated corn biorefinery. This process uses not only the starch from the corn, but also the cellulose from the corn stover. It produces chemicals, bioethanol, and power, and it feeds the production of Dupont’s Sorona® polyester.
Another area is the forest, pulp and paper industry. Bull said this industry should think about producing not only pulp and paper, but also ethanol and other chemicals by stripping the hemi-cellulose away and using cellulose for the pulp manufacturing. The lignin and other residuals can be subjected to a thermochemical process, and a variety of chemicals can be produced from there.
Bull said industry in general does not associate the word biomass and its possibilities with its products. The challenge is to make the industry more viable and, at the same time, really start up this business of a biorefinery. Genomics, proteomics, and bioinformatics are going to be key technologies in the biomass industry, Bull said. It may be new chemistry, but it is chemistry nonetheless.
One of the areas that needs to be worked on is producing chemicals to subsidize ethanol production. The projected cost of ethanol from a plant built today would be on the order of $2.50 a gallon, compared to corn ethanol at $1.20. Bull said a cost reduction is feasible in the future through improvements in technology and processing. Adding chemical products and materials would bring the cost down at a more rapid rate.
While it is easy to think of biomass as the primary alternative feedstock, water could also fit that category. Hydrogen does not exist freely in nature, nor can it be mined. However, it can be obtained from fossil energy. Hydrogen is embodied in biomass, and it is also in water, but energy is necessary to extract it. This energy has to come from somewhere else, like nuclear, geothermal, solar, or wind. There are two key routes: one breaks down hydrocarbons to elemental hydrogen and, usually, CO2. A cleaner and neater way is to generate hydrogen from water, for example by electrolysis.
There are several ways to produce hydrogen, Bull said. But if it is derived from natural gas, coal gasification, or other fossil routes, then all the problems that conventional fossil fuels have are not avoided. However, there is the option of using biomass and reforming it. “You can gasify it, pyrolize it, and get there,” Bull stressed. Photoelectrochemical water splitting with direct sunlight is another possibility. There is enough energy in sunlight to split water, but lakes bubbling hydrogen do not exist, because water is transparent to sunlight and there is not enough absorption. There are two ways to bypass this problem. One method is to use a combined photovoltaic system and electrolyzer as a single device. By doing this, the system can be at least 30 percent more efficient than a two-step system in which a solar panel produces electricity to electrolyze water to hydrogen. The second method employs photosynthetic algae: the idea would be to have ponds of algae that efficiently absorb sunlight and are manipulated to produce hydrogen.
Klaus Lackner raised the possibility that competition for agricultural land would arise between growing biomass and feeding the world’s population, which may reach between 10 and 20 billion people. This is especially pressing given that many people in China and India are moving away from traditional diets toward ones that involve more meat.
Stanley Bull replied that the U.S. economy could handle that, as the food supply is adequate. He said that was why there is significant attention on the cellulosic parts, because they may in principle not compete head-to-head with food. Switch grass can be used on marginal lands; it can grow and sustain itself and maintain productivity for up to 10 years. Food would always be provided first, because that is going to be more economical, Bull said.
One participant asked if municipal waste contains many contaminants that may be difficult to deal with, like PVC or fluorescent light bulbs. Trash that has been separated works best, Bull said. “So, I think that is really the best answer, is to do as much source separation as you can,” Bull stressed. “That is why the public doesn’t like it, because you put it through an incineration plant and you have got stuff that is really hard to control coming out,” he added.
Valerie Thomas asked if it would be effective to have the kinds of subsidies that exist now for corn ethanol and for some of the other fuels.
Bull said that without some significant advantages, like used equipment or free feedstock, a plant could not compete. He said the drop in enzyme cost has helped to reduce the total cost. Technology is important, but the industry will also need incentives, such as the production tax credit, Bull pointed out. He gave wind as an example. Electricity can be produced from wind at five cents a kilowatt-hour, which is competitive, but the utilities want an additional margin to feel comfortable in putting their future in that technology. The 1.8-cents-per-kilowatt-hour incentive fills that gap, Bull pointed out. When that production tax credit is in place, the wind industry booms, but when it is in a hiatus, everybody stops. “Of course, part of the reason they stop is, they think the tax incentive is going to come back, so why go build something until it is going to come back. It is a chicken and egg thing a little bit,” Bull said. Translating this to the biomass world would mean that more tax incentives would help stimulate the industry. “Our whole target is to get it to be cost competitive without tax incentives of any form, but we need it today,” Bull said.
James Wishart mentioned that one of the other potential competitors to hydrogen as the basis for a fuel economy is methanol, which has been advocated by some people, including George Olah. He added that while the route to methanol from biomass is more difficult than the route to ethanol, methanol could be used in a closed cycle for forming fuel by photoelectrochemical or photochemical activation of CO2. Methanol has the advantage of being a liquid form of chemical energy, as opposed to hydrogen, which is a gaseous form and difficult to store. Methanol can be obtained by gasifying biomass and synthesizing methanol from the resulting gas. There are also researchers working on catalytic conversion of CO2 to methanol, but thus far no results have appeared. Unfortunately, methanol is not as friendly a fuel from a toxicity point of view as ethanol.
There is a problem with having too many fuels that are incompatible or immiscible, because infrastructure for everything is not affordable. “The problem is, who is going to play God and decide which one of them we are going to focus on. Right now, the President is saying it is hydrogen,” Bull said.
Richard Wool said that to meet 100 quads of energy demand in the United States with solar energy, a piece of land 100 miles by 100 miles would have to be covered in today’s solar collectors. Solar is at least twice as expensive as gas. However, as niche applications are used more and more, and manufacturing is geared up, the costs will continue to come down and “one day we will all be using solar,” Bull predicted.
SUSTAINABLE FUELS AND CHEMICALS
Mark Holtzapple talked about the MixAlco process. The process fixes solar energy in the form of biomass and then runs it through a biorefinery.
The refinery produces alcohol fuels, which are burned. This releases carbon dioxide, which is fixed into the biomass through photosynthesis. This cycle doesn’t add new carbon dioxide to the atmosphere. As long as the sun shines, there will be a perpetual source of energy, Holtzapple said.
Holtzapple proposed ideal features of a biomass process. As a source, any feedstock could be used, such as trees, grass, agricultural residues, energy crops, municipal solid waste, sewage sludge, and animal manure. The alcohol potential from waste biomass is about 135 billion gallons per year in total. Gasoline consumption is 130 billion gallons a year, and diesel 40. This means that alcohol from biomass has the potential to replace a significant amount of our liquid transportation fuels.
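A quick ratio using the figures above shows why the claim is plausible; note that this simple gallon-for-gallon comparison, which is ours, ignores the lower energy content per gallon of alcohol relative to gasoline:

```python
# Potential alcohol from waste biomass vs. current liquid-fuel demand,
# in billion gallons per year (figures from the text).
alcohol_potential = 135
gasoline, diesel = 130, 40

fraction = alcohol_potential / (gasoline + diesel)
print(f"Waste-derived alcohol could cover about {fraction:.0%} of demand")
```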
Ideally, high productivity feedstocks could be used. Right now, corn is the source of biomass and it only produces about 3.4 dry tons per acre per year. Compared to alternatives such as sweet sorghum and energy cane, it is not a productive crop.
Ideally, farming would be an economical activity. Right now, a farmer sells corn at $2.40 a bushel, which generates $340 per acre in gross income. Farmers who grew sweet sorghum at $40 a ton would double their income; those who grew energy cane at $40 a ton would triple it. Furthermore, the environmental impact, in terms of water, fertilizer, pesticides, herbicides, and soil erosion, would be lower.
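The yields implied by that income comparison can be backed out from the quoted prices; the per-acre tonnages below are our inference, not figures given in the talk:

```python
# Back out the per-acre yields implied by the income comparison above.
corn_price = 2.40    # $/bushel
corn_income = 340.0  # $/acre gross income

corn_yield = corn_income / corn_price  # bushels/acre (inferred)

biomass_price = 40.0  # $/ton for sweet sorghum or energy cane
sorghum_tons = 2 * corn_income / biomass_price  # tons/acre to double income
cane_tons = 3 * corn_income / biomass_price     # tons/acre to triple income

print(f"corn: about {corn_yield:.0f} bu/acre; "
      f"sorghum: {sorghum_tons:.0f} t/acre; cane: {cane_tons:.1f} t/acre")
```

The implied tonnages (roughly 17 and 25 dry tons per acre per year) are several times corn's 3.4 dry tons per acre, consistent with the productivity comparison in the preceding paragraph.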
Holtzapple then talked about aquatic biomass, which is phenomenally productive. In addition, water hyacinths, for example, are a very beautiful crop. Grown without any CO2 enrichment, they yield up to 70 dry tons per acre per year, which dwarfs energy cane; enriched with carbon dioxide, the yield can be 100 dry tons per acre per year.
Next Holtzapple described the ideal refinery. Ideally, it would not be sterile, because it is very hard to maintain sterility on a large industrial scale. It would also be preferable not to use genetically modified organisms, both because of public concern and because the containment and disposal methods needed to prevent their release into the environment are costly. The process should be adaptable to varying feedstocks. Ideally, it would not need enzymes or vitamins, because they incur costs, and it would give high product yields. The process should not depend on the economics of co-products, such as proteins. Currently, corn-derived ethanol has a protein byproduct that is responsible for about a third of the income.
Lastly, the fuel itself has to be competitive in terms of octane rating, volatility, the ability to ship it through pipelines, energy content, heat of vaporization, and whether it will damage the ground water if accidentally released. MTBE is problematic, but ethanol is considered a good alternative. It has a very high octane rating, and it is considered an innocuous chemical, a highly desirable characteristic if it is accidentally released into the ground water. However, it has some major drawbacks. For example, it raises the vapor pressure of gasoline. It also cannot be shipped through pipelines, since it picks up water and causes problems in downstream applications. It has a fairly low energy content (about 84,000 BTUs per gallon), and there can be cold-start problems with pure ethanol. “What I would suggest is that mixed alcohols are superior on all of these various features,” Holtzapple said.
The MixAlco process produces a product that has all these ideal features, Holtzapple argued. He first described the process in its original version. First, the biomass is treated with lime. The treated biomass is then fermented with a mixed culture typically derived from soil. “[L]iterally, you just throw dirt on the biomass and let it rot,” Holtzapple said. The rotting biomass gives off carboxylic acids, chiefly acetic, propionic, and butyric acid, and calcium carbonate is added to neutralize them. The carbonate reacts with the acids to make calcium acetate, propionate, and butyrate. The water is removed from these carboxylate salts, which are then heated to produce ketones such as acetone and, if hydrogen is added, alcohols such as isopropanol. “We are turning cattle manure into salts of vinegar, nail polish remover, and rubbing alcohol,” Holtzapple said.
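The neutralization step can be written out explicitly. The equation below is our rendering of the acetic acid case; the propionic and butyric acid reactions are analogous:

```latex
% Calcium carbonate neutralizes the fermentation acids to carboxylate salts,
% releasing water and carbon dioxide (shown for acetic acid):
\[
2\,\mathrm{CH_3COOH} + \mathrm{CaCO_3} \longrightarrow
\mathrm{Ca(CH_3COO)_2} + \mathrm{H_2O} + \mathrm{CO_2}
\]
```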
Holtzapple said he believed the hydrogen economy could become viable through the MixAlco process, because the process consumes hydrogen in making biofuel. The biofuel is a hydrogen carrier: it converts the energy of the hydrogen into a liquid transportation fuel that is completely consistent with the current infrastructure. As much as 35 percent of the energy content of the alcohol can come from the hydrogen. This means it is a very safe, reasonable way of getting to the hydrogen economy, Holtzapple said.
To assess how well the technology works, Holtzapple uses an in situ digestion method. Scientists take about two grams of biomass and put it in a container resembling a tea bag. The tea bags are put into a porous sac, and the porous sacs are placed into the rumen of a cow by way of a surgical hole called a fistula. The bags are incubated, washed, dried, and weighed. Digestion yields are measured from the mass of biomass remaining; for example, if the experiment starts with two grams and one gram is left over, the sample was 50 percent digested.
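The yield arithmetic described above is simple enough to state as a small helper; the function name is ours:

```python
# Digestion yield from an in situ rumen incubation: the fraction of the
# starting biomass that disappeared during digestion.
def digestion_yield(initial_g: float, remaining_g: float) -> float:
    return (initial_g - remaining_g) / initial_g

# The example from the text: start with 2 g, 1 g remains.
print(f"{digestion_yield(2.0, 1.0):.0%} digested")
```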
More recently, Holtzapple has come up with an advanced lime treatment process. For this, he built a big pile of biomass with lime in it and then aerated it to get the combined effect of oxygen and lime. The addition of air is a necessary step; without it, the lignin prevents about a third of the biomass from being fermented. The next step is the fermentation. Organic acids occur naturally everywhere, including in the rumens of cattle, sheep, deer, and elk, in sewage digesters, in swamps, and in termite guts. Nature favors the production of organic acids because the process is thermodynamically driven. The fermentation produces acetic acid as well as propionic and butyric acid. Methane can also be produced, but Holtzapple adds an inhibitor that blocks methane production. As a result, the energy that would have been lost to methane accumulates in the higher acids.
The product distribution is a function of temperature. At mesophilic conditions—at the temperature of a cow—the distribution is about 41 percent acetic acid. If the temperature is raised to 55 degrees Celsius, the process highly favors acetic acid production. The process utilizes a proprietary marine organism that is resistant to high salt concentrations. The microbe is used in the inoculum, leading to high product concentrations.
The grand vision is to build the pile with biomass, lime, and calcium carbonate. During the first month or so, air is ventilated up through it to remove the lignin and make the biomass digestible. A tarp cover is then thrown on top, the microbe-infused dirt is added, and the pile is allowed to rot and produce organic acids. The next step is de-watering. Vapor compression is the preferred de-watering method: placing a vacuum on the salt solution causes it to boil, producing distilled water and concentrated salts.
The thermal conversion step is next. The calcium acetate becomes acetone, and the higher acids become higher molecular weight ketones. A 99 percent conversion at about 440 degrees Celsius takes about 25 minutes. Finally, the hydrogenation step takes place, in which the ketones become the corresponding alcohols. This step uses a Raney nickel catalyst in liquid phase. The process is geared toward ethanol at the moment, as that is where the tax credits are. The calcium acetate solution is run over several steps to produce ethanol.
Holtzapple then introduced the economics of the process. The ketones can be produced for about 60 or 70 cents a gallon, providing a substantial profit margin at the moment, as the current price of acetone is $3.50 a gallon. Alcohols can be produced for around 80 cents a gallon. Acetic acid, for example, can be produced in the six-cents-a-pound range and sells for about 25 cents a pound. The alcohols—in this case mainly ethanol—are in the 60 to 70 cents a gallon range. These figures are calculated for a biomass price of $40 a ton.
A possible scenario is taking the sugar cane, extracting the sugar, and then producing alcohol from the fiber. The sugar could be used to make a range of products, such as biodegradable polymers, rubber, or even fibers such as Dupont Sorona®.
Holtzapple then named the factors for replacing gasoline. With current engine efficiency, 248 biorefineries would be needed, and a land area of 300 by 300 miles would be required for sugarcane. At double the efficiency, the required area is reduced to 200 by 200 miles. At triple the efficiency, it shrinks to 174 by 174 miles. In comparison, the land requirement for sweet sorghum would be 340 by 340 miles. Holtzapple is currently
testing an engine that could be 49 to 55 percent efficient. This translates into a gas efficiency of 75 to 100 miles per gallon in a full sized car.
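The land figures quoted above follow from a simple scaling: required area varies inversely with engine efficiency, so the side of a square plot varies as one over the square root of the efficiency multiple. A sketch of that scaling, taking the 300-by-300-mile sugarcane figure as the baseline (the quoted 200-mile figure for double efficiency appears to be a rounded value; this scaling gives about 212 miles):

```python
import math

# Side of the square land area needed for sugarcane, scaled by engine
# efficiency. Area ~ 1/efficiency, so side ~ 1/sqrt(efficiency).
BASELINE_SIDE_MILES = 300  # at current engine efficiency (from the text)

def side_at_efficiency(multiple: float) -> float:
    return BASELINE_SIDE_MILES / math.sqrt(multiple)

for k in (1, 2, 3):
    side = side_at_efficiency(k)
    print(f"{k}x efficiency: about {side:.0f} x {side:.0f} miles")
```

At triple efficiency the scaling gives about 173 miles on a side, matching the 174-by-174-mile figure quoted in the talk.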
One participant wanted to know the time scale for realization of the project. Holtzapple estimated that a small demonstration plant should be running by the end of 2005; a full scale commercial plant might take two or three years.
Another participant asked if sustainable agricultural practices were part of the project. Sugar cane has been growing in Cuba on the same land for hundreds of years, and the land still maintains its productivity. “So, if you do it right, the soil doesn’t wear out,” Holtzapple said.
Frank Flora asked how the process is controlled. Calcium carbonate is self-buffering; it maintains a pH range of 5.8 to 6.2. The temperature is controlled by circulating liquid through the pile, which can then go through a heat exchanger. “[A]ll the control that we need is just temperature, pH, and keep that inhibitor in there so that we don’t degenerate to methane,” Holtzapple said.
ROUTES AND COMMODITY CHEMICALS FROM RENEWABLE RESOURCES
Douglas Cameron spoke about Cargill’s work in the area of green chemistry and sustainability. He also talked about two carbohydrate-based examples of the chemical biorefinery: lactic acid and 3-hydroxypropionic acid (3-HP).
Cameron predicted that biomass feedstock prices will continue to drop in real terms, due to improvements in agronomic practice and in biomass conversion technologies. Process technology has to continue to evolve for this to happen, and government funding is needed to help spur it along. Novel catalysts are going to be important, he said. Product development and selection of the right products are important, as are platform-chemical concepts and partnerships between companies like Cargill, which understand agriculture, and chemical companies.
Cameron then introduced some examples. He started with the production of polylactic acid (PLA). The concept is to make lactic acid by fermentation of sugars, which is extremely attractive. It is then chemically converted to lactide. Starting mainly with the L,L form, the product is primarily L,L-lactide with a small amount of the D,L-lactide and an even smaller amount of the D,D-lactide, which can be removed by vacuum distillation. The polylactic acid is then produced through ring-opening polymerization.
Cargill now has a large production plant, where it makes various types of fibers, clothing, and other applications that are being developed, such as carpets and tile squares. PLA has the property of dead fold, so it
can be used in wrapping, typical biodegradable type applications, and bottles. Recently, a company in Colorado started selling bottled water made out of PLA bottles.
Cargill is now the biggest producer of lactic acid, and its process utilizes a wide range of renewable sugars, including glucose. Lactic acid is produced by an anaerobic fermentation, which means the process is not energy intensive, since oxygenation is one of the most energy-intensive parts of fermentation. The theoretical yield of lactic acid from glucose is 100 percent, which compares favorably with the theoretical yield of other glucose-derived products, such as ethanol (51 percent). There are two functional groups in lactic acid, which means that it can act as a platform molecule for other chemistry.
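Those two theoretical yields can be checked against the fermentation stoichiometry; the molar masses below are standard values, and the arithmetic is ours:

```python
# Theoretical mass yields from glucose fermentation.
# To lactic acid:  C6H12O6 -> 2 C3H6O3        (no carbon lost)
# To ethanol:      C6H12O6 -> 2 C2H6O + 2 CO2 (carbon lost as CO2)
M_GLUCOSE = 180.16  # g/mol
M_LACTIC = 90.08    # g/mol
M_ETHANOL = 46.07   # g/mol

lactic_yield = 2 * M_LACTIC / M_GLUCOSE    # grams of product per gram glucose
ethanol_yield = 2 * M_ETHANOL / M_GLUCOSE

print(f"lactic acid: {lactic_yield:.0%}, ethanol: {ethanol_yield:.0%}")
```

The lactic acid route keeps all of the glucose mass in the product, while the ethanol route loses nearly half of it as carbon dioxide, which is exactly the 100 percent versus 51 percent comparison quoted above.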
One potential product from lactic acid is hydroxypropionic acid, which may be either 2- or 3-hydroxypropionic acid. This acid is not widely available through the chemical industry, not even from Sigma Chemical or Aldrich Chemical. Since it is not readily available, its possibilities have not been extensively explored. Cargill also determined that no microorganisms had previously been used to make it.
This case exemplified a green chemistry design that looked attractive in concept, Cameron said. In practice, however, there were many unanswered questions. Cargill evaluated hydroxypropionic acid in terms of its properties and its safety. Like lactic acid, it could act as a platform chemical; for example, it could be oxidized to malonic acid, a pharmaceutical intermediate currently manufactured through a chemically messy route. It could also lend itself to polymerization and to hydrogenation to 1,3-propanediol, which would provide an alternative route to a chemical ingredient for the Dupont product Sorona®. Hydroxypropionic acid could also be dehydrated to acrylic acid, a seven-billion-pound product with rapidly growing demand; it is the basis of the superabsorbent polymers used in diapers and other products.
This could potentially give Cargill a renewable route to a compound like acrylic acid and a whole host of other materials. Since the acid had not previously been investigated, Cargill researched some of its applications. One of the things the company found is that the calcium salts of 3-hydroxypropionate are highly soluble relative to those of many other organic acids. This opened unique opportunities for the acid in cleaning applications such as removing “scale” (substance buildup) from various materials. In addition, the corrosiveness of hydroxypropionic acid is much lower than that of a number of other acids, making 3-HP or its salts a potentially good antifreeze agent.
While Cargill now saw opportunities with hydroxypropionic acid, there was yet no microbial method to make the chemical. Cargill wanted to determine if they could design and build metabolic pathways to do
this. Criteria for the process included a 100 percent theoretical yield, the availability of necessary genes and enzymes, and several other design factors. To begin, Cargill mapped every possible metabolic pathway from glucose to 3-HP and found the most intriguing one to be the beta-alanine pathway. This route only requires a microorganism to ferment glucose to pyruvate, plus an additional four steps to make 3-HP from pyruvate by way of beta-alanine. These additional steps could potentially be engineered into the microorganism.
How does the pathway compare to the criteria? Energetically, it is very favorable. The theoretical yield of this pathway translates into one gram of 3-HP per gram of glucose under anaerobic conditions, with thermodynamics identical to those of lactic acid production. The one major challenge in the pathway is that it requires a catalytic step that doesn’t exist in nature. However, enzymes do exist in nature that convert alpha-lysine to beta-lysine, suggesting that an analogous activity could be engineered for alanine.
Ideally, the pathway could be transferred into an organism like E. coli and the various byproducts could be eliminated. “So, you would have an organism that makes 100 percent 3-HP from glucose, and that is what we are proceeding to do,” Cameron said.
Vegetable oils are another important research area for Cargill. One of the current business divisions is involved with industrial oils and lubricants, and it studies how to produce hydraulic fluids, transformer fluids, and various other products made from vegetable oils. For example, vegetable oil is used to make biodiesel in Europe, and a byproduct of biodiesel is glycerol. “So, there are opportunities to figure out what to do with that,” said Cameron. He also mentioned that Cargill has a very active research program in polyurethane polyols, and there are some compelling reasons to make polyols from vegetable oils. For example, polyols are used for the production of polyurethanes for automotive seats.
Cameron said Cargill uses fatty acids and performs various chemical reactions with them, such as metathesis chemistry (the interchanging of groups around double bonds). This is done in partnership with a company called Materia. Dienes for rubbers, or alpha olefins for polyethylenes, can be made from unsaturated fatty acid starting materials.
When asked about toxicity studies on 3-HP, Cameron said the data show lower toxicity than lactic acid, both in fathead minnow aquatic toxicity tests and in various animal feeding studies. However, the concern is that it is quite easy to dehydrate 3-HP to acrylic acid. He added that the chemical is a component of what is called the 3-hydroxypropionate cycle, which is present in many bacterial species, leading researchers to think that it could be rapidly metabolized in nature.
SUSTAINABILITY, STOICHIOMETRY, AND PROCESS SYSTEMS ENGINEERING
The chemical industry has been driven by material substitution for decades, Jeff Siirola said. Products are mostly made from methane, ethane, propane, and aromatics. Eighty percent of all manufacturing energy, and 80 percent of all manufacturing wastes, are associated with the processing industries. Today, even though the chemical processing industries are considered to be energy intensive, energy cost has not actually been a dominant economic factor, not even in distillation processes. Siirola said that his company does not make products that require more than 5,000 BTUs per pound. With coal priced at 80 cents per million BTUs, energy does not comprise more than five percent of the sales price, which means that it rarely costs more than two, perhaps three cents per pound of product.
The cost of energy has gone up by an order of magnitude, but the ratio of energy to capital costs is unchanged and has remained so for the last century. Sometimes, however, certain sources of energy change in price relative to others, Siirola said. For example, natural gas today is much more expensive relative to coal.
There will be an increase in the global demand for energy, Siirola predicted. As the world’s population grows to between nine and ten billion people, GDP (gross domestic product) will increase six- to seven-fold. At the same time, most commodities, most materials of construction, and the chemical industry will grow five- or six-fold. Energy demand is expected to increase by a factor of about three and a half, which translates into a growth rate about half that of GDP growth. This ratio has remained fairly constant for the last several decades due to technological innovations.
According to Siirola, the imperative of not harming the environment and not increasing the impact on it would be the basis of his discussion. He then asked what it means to not limit the choices available to future generations. He discussed how a choice should be made between two different sources of a raw material or energy with two different costs for the chemical industry. “Should I only use the more expensive one and leave the cheaper one to future generations, [or should] I extract from both sources in the same percentage that I find them so that I leave the future generation the same problem that [I’ve] got?” Siirola asked. He argued that the cheaper one will be exploited, resulting in a price rise and leaving the future generation to fend for itself. “No customer will pay extra to exploit something more expensive so that future generations are left with a cheaper alternative,” he said.
Siirola then spoke about raw material selection for the chemical industry. Selection criteria include factors such as availability, concentration, extraction cost, competition for the material, other alternatives, and how close the raw material is in chemical or physical structure and in oxidation state to the product.
The oxidation state of a material is a particularly important factor, especially for carbon. Carbon’s oxidation states range from –4 (methane) to +4 (carbon dioxide). Free energy changes between each of those states are almost exactly equal, amounting to about 25 kcal per mole per oxidation state. Most of the polymers in the world lie somewhere between the –2 and –0.5 state, and most oxygenated organics exist between –1.5 and zero. Methane is –4, ethane is –3, many hydrocarbon-type materials are –2, and many oxygenates are a little bit higher, with acetic acid at zero. Carbonate salts are found at the bottom of this chain. Carbon in the +4 state is acidic and can be neutralized, and its salt has a free energy some 40 kilocalories per mole lower than that of the free acid.
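The oxidation-state bookkeeping Siirola describes can be sketched for neutral CxHyOz molecules, assuming hydrogen at +1 and oxygen at −2 (an illustrative calculation, not code from the talk):

```python
# Average carbon oxidation state for a neutral CxHyOz molecule.
def carbon_oxidation_state(c, h, o):
    # The oxidation states of a neutral molecule must sum to zero:
    # c*x + h*(+1) + o*(-2) = 0, so x = (2*o - h) / c
    return (2 * o - h) / c

print(carbon_oxidation_state(1, 4, 0))   # methane CH4        -> -4.0
print(carbon_oxidation_state(2, 6, 0))   # ethane C2H6        -> -3.0
print(carbon_oxidation_state(2, 4, 2))   # acetic acid C2H4O2 ->  0.0
print(carbon_oxidation_state(1, 0, 2))   # carbon dioxide CO2 -> +4.0
```

These values reproduce the scale quoted in the text: methane at –4, ethane at –3, acetic acid at zero, and CO2 at +4.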
Siirola then counted up the available carbon on earth. There are about 75 gigatons of carbon in natural gas, 120 gigatons in oil, 900-plus gigatons in recoverable coal, and about 250 gigatons each in oil shale and tar sands. There are about 2,500 gigatons that are recoverable only at prices higher than today’s, such as tight oil and gas and coal seams currently too thin to mine. Carbon is also found in methane hydrates, with estimated tonnage ranging from a few thousand gigatons to a number so large that they could not have been generated biogenically. The total biomass on the surface of the earth is 500 gigatons, and the amount of carbon located in the top meter of soil, plus peat, is four times as high. The total amount of biomass produced each year on the earth is 60 gigatons, and an equal amount is destroyed. More than half of that amount is located in the tropical rain forest and the tropical savannah. The total amount of chemicals produced in the world is three tenths of one gigaton of carbon.
Turning to oxidized carbon atoms, the total amount of CO2 in the atmosphere at today’s concentration amounts to 750 gigatons of carbon. There is twice as much coal as there is biomass, and more coal than there is CO2 in the atmosphere. There are approximately 40,000 gigatons of carbon dissolved as carbonates in the ocean and an additional 100 million gigatons of carbon found in limestone, chalk, and dolomites.
In sum, this means that the world is not running out of carbon atoms, Siirola said. Some of them are in a reduced state, but far more are in an oxidized state. If the raw material for carbon is in a lower oxidation state than the product, it has to be oxidized. This may occur through direct and indirect partial oxidation. The oxygen source is almost always atmospheric oxygen, and the reactions are overall exothermic. Selectivity is sometimes an issue,
purification is oftentimes an issue. Sometimes a disproportionation reaction will form the desired oxidation state: the carbonaceous material is oxidized and at the same time co-produces hydrogen. Almost all the hydrogen produced is used, often for environmental purposes, for example to remove sulfur from raw materials. In carbonylation chemistry a molecule is oxidized to carbon monoxide, which then reacts to the desired oxidation state, and this is the only oxidation that is very easy to reverse.
If the raw material is at a higher oxidation state, it has to be reduced, which ultimately requires hydrogen. Hydrogen production and the reduction reaction frequently occur in tandem and are net endothermic. This means energy has to be added to reduce carbon. Starting with an intermediate oxidation state, a disproportionation process can be a good choice, since it drives part of the carbon into a lower state. Fermentations, for example, work that way: glucose will make ethanol at the –2 state while oxidizing an equal number of carbon atoms to the +4 state to produce energy.
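The electron balance behind that fermentation example can be checked with a short sketch, assuming the standard stoichiometry C6H12O6 → 2 C2H5OH + 2 CO2:

```python
# Glucose carbon (average oxidation state 0) disproportionates into
# ethanol carbon at -2 and CO2 carbon at +4.
def avg_state(c, h, o):
    # Average carbon oxidation state of neutral CxHyOz (H = +1, O = -2).
    return (2 * o - h) / c

glucose_state = avg_state(6, 12, 6)   # 0.0

# Four of the six glucose carbons end up in ethanol (-2), two in CO2 (+4):
product_avg = (4 * avg_state(2, 6, 1) + 2 * avg_state(1, 0, 2)) / 6

# The carbon-weighted average is conserved, so no external oxidant
# or reductant is needed -- the hallmark of a disproportionation.
assert glucose_state == product_avg == 0.0
```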
In photosynthetic reduction, the energy comes from the sun, and the reduction of CO2 co-produces oxygen. To make a mole of hydrogen, either water has to be split or a carbon atom has to be oxidized by two states. In steam reforming of methane or in biomass gasification, a reduced material reacts with water to make hydrogen and CO2. Splitting water into hydrogen and oxygen can be done by electrolysis, and there are experiments to achieve this directly by photolysis.
Taking this into account, there is the question of what is the most sustainable raw material for the chemical industry, Siirola said. “Should it be the one that is the most abundant, which means that we would make everything out of limestone? Should it be the one for which a natural process already exists to do some of the energy change that would be required to exploit it, which would be atmospheric carbon dioxide?” he asked. Oil might require the least additional energy to process into the most final products. Methane or condensate are likely to be least contaminated with unwanted elements like sulfur and nitrogen. Glucose or lignin might be structurally closest to some desired products.
If the world population stabilizes at a number below 10 billion, with the attendant GDP growth and energy needs, there will be new energy demands of 2,500 quads. This demand breaks down to one third transportation, one third electricity, and one third domestic heating and industrial requirements, which will mean close to 5,000 gigawatts of new power. This is a great demand for power: it would mean one one-gigawatt power plant every three days for the next 50 years, or about 1,000 square miles of new solar cells at 10 percent efficiency and annual sunlight conditions similar to a cross between those of Tennessee and Nevada. It means carbon emissions climbing from today’s seven gigatons
a year to maybe 26 or 30 gigatons a year. That assumes the same fossil and nonfossil mix and the same mix of fossil fuels. “Actually, if we run out of methane, the amount of carbon [produced] will be even greater than this for the same amount of energy,” Siirola pointed out.
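The build-out arithmetic above can be verified with a one-line sketch:

```python
# 5,000 GW of new capacity over 50 years, delivered as 1-GW plants.
new_capacity_gw = 5_000
years = 50

days_between_plants = years * 365 / new_capacity_gw
print(f"one 1-GW plant every {days_between_plants:.1f} days")  # ~3.6 days
```

This matches Siirola's "one gigawatt power plant every three days" to within rounding.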
There are 6,000 gigatons of fossil fuels available, counting biomass, peat in the ground, and unrecoverable reserves, but not including the hydrates. At some price, more and more—but not necessarily all—will be exploited, with demand growing to three times the current seven gigatons per year. “Will we run out? Yes, at this rate we will, but not tomorrow, [and that] is the point,” Siirola said.
Siirola counted the options for sequestering CO2, such as burial or absorption into geological formations, coal beds, the deep ocean, saline aquifers, or tight formations, all of which he deemed not entirely satisfactory. There are even fewer options for the energy involved in transportation. In transportation, mass counts, so technology to absorb CO2 on board lightweight vehicles does not exist. Even the lightest absorbent known, lithium oxide, is still too heavy to be useful.
Siirola then talked about non-fossil options. Biomass costs too much in terms of cropland. The world’s total net production of biomass is close to 60 gigatons per year. The world’s crops currently amount to six gigatons, and that is likely to grow to nine gigatons as the world’s population approaches nine billion. Most of the cropland will be needed to feed people. Energy crops currently comprise a hundredth of a gigaton. The projected energy need will be 26 gigatons per year by the time the population levels out, and projected chemical needs, at the same time, might rise to 1.5 gigatons per year. The 60 gigatons of net biomass could in principle be utilized, but that would mean harnessing half of the annual biomass produced on the planet for energy needs. “So, could you do it? Yes. Are you likely to do it? I kind of doubt it. I just don’t see an infrastructure to collect 50 percent of the biomass,” Siirola said.
If CO2 cannot be captured, fossil fuel cannot be a viable option, Siirola pointed out. Solar power could be a worthy alternative. Compared to biomass, solar power has a much higher energy density. Siirola calculated that the power density obtainable by turning biomass into pyrolyzed coke would be four tenths of a watt per square meter, assuming no energetic cost of growing the biomass. A solar cell on average provides between 20 and 40 watts per square meter: 20 in a place like Tennessee, where it is cloudy, and 40 in a place like Nevada.
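A back-of-envelope version of that comparison can be sketched with illustrative inputs (the coke yield and heating value below are assumptions for the sketch, not Siirola's exact figures):

```python
# Areal power density of biomass-derived coke versus photovoltaics.
SECONDS_PER_YEAR = 365 * 24 * 3600

# Assumed: ~0.4 kg of pyrolyzed coke per m2 per year at ~30 MJ/kg.
coke_yield_kg_per_m2_yr = 0.4
coke_energy_mj_per_kg = 30

biomass_w_per_m2 = (coke_yield_kg_per_m2_yr * coke_energy_mj_per_kg
                    * 1e6 / SECONDS_PER_YEAR)

solar_w_per_m2 = 30   # midpoint of the 20-40 W/m2 range quoted for PV

print(f"biomass: {biomass_w_per_m2:.2f} W/m2")   # ~0.38 W/m2
print(f"solar is ~{solar_w_per_m2 / biomass_w_per_m2:.0f}x denser")
```

Under these assumptions the biomass route lands near the 0.4 W/m2 Siirola quotes, roughly two orders of magnitude below photovoltaics.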
But the difference in capital costs between the two choices becomes an issue. The capital costs for planting hundreds of square miles of biomass could be minimal compared to the capital costs of photovoltaics, which are nearly an order of magnitude higher than the capital costs of a coal-fired power plant of the same output.
Solar energy must be collected and stored. It could be stored in atmospheric pressure gradients, as with wind; in elevation gradients; as molecular hydrogen or as carbon in the zero oxidation state; or as latent or sensible heat in thermal storage. “All these things are possible,” Siirola said.
Hydrogen has many advantages, Siirola pointed out. First of all, it produces fewer pollutants and no CO2 at the point of use. But molecular hydrogen is not freely available, is difficult to store, has a very low energy density, and is an energy carrier rather than an energy source. “If hydrogen comes from reduced carbon, then the same amount of CO2 is going to be produced whether or not we just burn the carbon or make the hydrogen and burn the hydrogen,” Siirola argued.
Solar and nuclear are the only long-term energy solutions, Siirola concluded. “By a factor of 10⁵ there are far more carbon atoms in the high oxidation states than the low oxidation states,” he calculated. The environmental impact of having the rest of them oxidized may be smaller than is currently thought. There is plenty of available carbon in the low oxidation states, and it exists close to the oxidation state of most of the desired chemical products. In addition, the high availability and the existence of photosynthesis do not necessarily argue for starting from CO2 as the principal raw material, as too much energy would have to be added to the system to obtain the products. “We can, however, get to any carbon oxidation state from any other, but going down an oxidation state costs you energy,” Siirola said.
One participant objected to nuclear being called a viable option. Siirola said he did not spend time talking about nuclear because its risk is a very large number multiplied by a very small number; he suspected that in the end the product would be such that nuclear remains among the alternative choices. Siirola said the problems of storage were serious, but not insurmountable, in terms of handling the long-term sequestration of solids. One of the participants added that nuclear looked very much like a green energy source.
Klaus Lackner said there might be more carbon reserves than accounted for; there may well be 20,000 gigatons available. However, that would not change Siirola’s conclusion; if anything, he argued, it supports it. Finally, Siirola offered that the amount of oxygen in the atmosphere is a key to estimating how much CO2 has been reduced.
Glenn Nedwin talked about how enzymes can help to achieve a sustainable future. “In the last century, we have had the industrial revolution, and what we believe is that we want to get to the industrial evolution,” he said.
Enzymes are ubiquitous. They are used in the detergent, starch, textile, fuel ethanol, pulp and paper, baking, brewing, wine, juice, food specialties, and animal feed industries. Nedwin listed enzymatic properties that contribute to sustainable development. First, they are biological catalysts with very specific properties that drive chemical reactions in living cells. They work under mild conditions in small quantities and are fully biodegradable. They are made by fermentation from renewable resources, and the excess biomass waste is used as a soil conditioner and fertilizer.
Enzymes can improve product quality and save water, energy, chemicals, and waste while speeding up production processes and enabling new products. Most enzymes come from microorganisms in the soil, with nearly half the enzymes in the world originating from bacteria and the remainder found in fungi.
Enzymes can be tailored to different uses, such as laundry detergent, where water conditions vary from place to place. To find a suitable enzyme, scientists search within natural environments that mimic the desired reaction conditions. For cold temperature conditions, for example, an organism that thrives in a cold environment is ideal. If a suitable enzyme can’t be found in nature, it is evolved in the lab.
Native enzymes can also be altered to change their activity, making them more alkaline-tolerant, thermally stable, or pH-sensitive. After an enzyme is discovered, it has to be made at very high levels by large-scale fermentation of microorganisms, which are initially streaked on a petri dish and then inoculated into gradually larger volumes of nutrient-rich broth. From these fermentations emerge the purified enzymes: proteases, amylases, cellulases, carbohydrases, lipases, esterases, and oxidoreductases. The challenge lies not only in producing them at high levels but also in finding uses for them.
How are enzymes involved in sustainable development? The detergent industry is an important example, as a mixture of proteases, amylases, lipases, cellulases, and mannanases is used in detergents. The use of these enzymes has allowed the washing temperature to be lowered from 60 °C to 40 °C. In Denmark alone, this has resulted in a savings of 28,000 tons of coal per year.
Enzymes also help save on the use of other chemicals and render detergents more effective. By using enzymes, fewer phosphates and other chemicals are needed while laundry is simultaneously made cleaner. The enzymes are fully biodegradable and gentler to fabrics. If the average washing temperature in Europe were decreased from 40 to 30 degrees, about a third of the electricity used for washing could be saved.
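A rough illustration of why lower wash temperatures save energy, using hypothetical load figures (15 liters of water heated from a 15 °C inlet; these numbers are assumptions for the sketch, not from the talk):

```python
# Sensible-heat demand of one wash load at two target temperatures.
def heat_joules(liters, t_in, t_out):
    # Q = m * c * dT, with c = 4186 J/(kg*K) for water and 1 L ~ 1 kg.
    return liters * 4186 * (t_out - t_in)

q60 = heat_joules(15, 15, 60)   # wash at 60 C
q40 = heat_joules(15, 15, 40)   # wash at 40 C

print(f"heating energy saved per load: {1 - q40 / q60:.0%}")  # -> 44%
```

Since heating dominates a warm wash's electricity use, cuts of this magnitude are what make the coal-savings figures above plausible.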
Another application is textiles. The average pair of jeans consists of cellulose fabric woven into denim. Previously, the fabric was dyed but required additional chemical and mechanical treatments to remove the remaining starch and obtain the desired “look.” Historically, oxidizing agents or sodium hydroxide pressure-cooking were needed to get rid of the starch; now, amylases are added instead. Cellulases in laundry machines replace the pumice stones originally used to make stone-washed jeans. The bleaching process now operates with laccase instead of chlorine and other bleaching agents.
To demonstrate that the environmental burden of making an enzyme is outweighed by its sustainable applications, Novozymes performed a cradle-to-grave life cycle analysis, comparing two systems: a combination enzymatic/chemical system and an enzyme-only system.
Scourzyme, a pectate lyase enzyme that breaks up pectin in cotton, was presented as an example of these life cycle analyses. It was evaluated across the entire process from cotton processing to knitting. First, a scouring step is required to bleach the cotton and prepare it for coloring, a process that normally requires sodium hydroxide, acetic acid, surfactants, energy, and large amounts of water. Novozymes examined this step and found that the enzyme could provide a tremendous reduction in the resources used in the original process. With the enzyme, energy demand is reduced by 25 percent, resource consumption is decreased by more than 25 percent, and water use is cut by 65 percent. If the process by which all cotton in Europe is scoured could be converted to this enzymatic process, it could prevent the pollution of enough water to supply 400,000 people. In 2001, Novozymes won a Presidential Green Chemistry Award for the process.
Another example of waste reduction is phytase, an enzyme that breaks up phytic acid in plant material. It can be used to replace the addition of inorganic phosphate to animal feed: animals fed phytase-treated feed need less inorganic phosphate in their diet, which results in less phosphate released to the environment and a significant reduction in nutrient salt pollution. If the 23 million pigs in Denmark were given phytase-treated feed, the estimated reduction in aquatic phosphate pollution would correspond to the potable water supply of an additional 300,000 people.
Another example is amylase in baking, which is used to extend the shelf life of bread. In bread treated with amylase, the structure of the starch is modified so that it doesn’t recrystallize, effectively preventing the bread from going stale. This reduces the waste in transportation and wheat production associated with the supply of bread. “If this was used in all the white bread in the United States, you would probably save the CO2 effects of 50,000 people,” Nedwin said.
A future use of enzymes could be in the production of bioethanol, which is made from corn starch, and the challenge is to make it this way
as economically as possible. Bioethanol is sustainable and is an almost CO2-neutral energy source. It can replace MTBE as an octane booster in gasoline. Ten states have banned MTBE, which creates a 1.4 billion gallon per year market right now. Fuel ethanol as a liquid energy source for the transportation sector is the only alternative to gasoline, except for biodiesel and natural gas. Today there are blends of 10 percent ethanol, and a 20 percent ethanol biofuel is used in Minnesota.
In the year 2000 alone, it is estimated that the ethanol industry added 22,000 new jobs and more than $15.3 billion to the gross output of the American economy. Replacing all the MTBE used in gasoline, about six percent of total gasoline volume, would require 10 billion gallons of ethanol a year. Current corn ethanol production is about three billion gallons a year, which can cover only about 30 percent of that replacement. Furthermore, total MTBE replacement would consume about 30 percent of the farmland growing corn.
These facts prompted the Department of Energy to fund a very significant research project with Novozymes and Genencor to evaluate alternatives, operating on a budget of about $18 million over four years. Corn starch is currently broken down by amylases to glucose. The alternative to corn starch is cellulosic material such as corn stover, which consists mainly of cellulose and hemicellulose. NREL worked out an acid pretreatment process for corn stover, which delivers 56 percent cellulose. The cellulose is further broken down using a mixture of cellulases. Unfortunately, the corn stover/cellulase process is 50 to 100 times more expensive than the corn/amylase route.
There are several ways of making enzymes less expensive, Nedwin said. These include reducing enzyme production costs by reducing the cost of feedstocks and enzyme recovery processes, employing onsite production of enzymes where the corn is grown, increasing the fermentation yield, or increasing enzyme activity on a per-gram basis.
All factors point to novel enzymes that can be genetically engineered and tailor-made for specific industrial processes. Enzyme candidates can be integrated into an expression host, characterized biochemically, and tested on the conversion of pretreated corn stover to ethanol. The enzyme cost to make a gallon of ethanol from corn starch is between five and ten cents. The enzyme cost from biomass was $5.40 at the onset of this project, and it is now down to 27 cents, a 20-fold reduction. The ultimate goal, in order to compete with corn starch, is about 10 cents. For this work,
Novozymes received two awards in 2004: the Scientific American 50 award and, together with NREL and Genencor, an R&D 100 award.
Nedwin said pretreated corn stover is about the furthest advanced in terms of technology today. The future requires other pretreatments and mixes of enzymes for different types of substrates. “If we look at making the biorefinery happen in a big way out of biomass, we need government support. Historically, the government has been a tremendous help in pushing industry, in the railroads, in the Detroit automotive industry, and even in biotech, by changing patent laws,” he pointed out.
Nedwin summarized the role of enzymes in sustainability. Today, only five percent of the fine chemical, polymer, bulk chemical, and specialty chemical industries are impacted by enzymatic processes or whole-cell microorganisms. McKinsey & Company estimated that by 2010 this number could be up to 10 to 20 percent, which would translate into less pollution through enzymatic routes. He added that knowledge of enzymology in the chemical industry needs to be broader, as does awareness of applications and demand. But in the end, enzymes must be price competitive.
Participants asked about enzymes in nonaqueous environments, since enzymatic hydrolysis normally occurs in aqueous environments. Glenn Nedwin said there are breakthroughs being made in that area, such as lipases that work in organic solvents and the immobilization of lipases and other enzymes.
William Koros talked about membranes and separation processes and their energy-saving potential. With the help of some examples, he showed that this technology has great potential but is still underdeveloped.
Separation technology is a promising place to save energy. In the United States, around 33 percent of total energy use occurs in the industrial arena, and 40 percent of that fraction is used for separations, which translates into about 15 percent of total energy use. If this number could be lowered, it would have an enormous impact. The opportunity is huge because global capacity will grow over the coming decades; if the capacity installed today is based on current, largely thermally driven technology, it will be locked in as the world stabilizes at a population of 10 billion. “[W]e are buying what we are going to live with in terms of thermal separations if we don’t do something,” Koros said.
Membranes have great potential as energy savers, Koros said. Looking across the separation spectrum, membranes are the low energy intensity enablers that can allow energy conservation in the chemical industry. In an ideal sense, this technology is not thermally driven, but instead mechanically driven. “So, one avoids a lot of the second law restrictions that are currently plaguing separation processes,” Koros said.
But the technology is insufficiently developed. Although membranes have the greatest potential to enable low energy processes, they are by far the most immature in terms of technical development, and their limited application reflects a failure to make and implement the improvements needed to install them at scale.
Koros pointed out that membranes could be used in large-scale processes. Many times people are under the mistaken impression that membranes do not scale, Koros said. Other very selective processes like chromatography and affinity methods do not scale very well, but membranes and adsorption do. In fact, membranes and adsorption are a very powerful “one-two punch,” because adsorption can deal with very dilute solutions, while membranes can deal with more concentrated solutions. In many cases, hybrid systems incorporating these two technologies are very attractive.
There are two fundamentally different kinds of membranes. One operates primarily on the basis of hydrodynamic sieving, which is not actually a filtration process but rather a more subtle one. The size of the rejected entity ranges from 20 Angstroms up to chunks of dust, and it is very easy to strip away the suspending medium. Usually an aqueous or organic suspending medium passes through the membrane while some undesirable component is rejected.
Something very different happens when the scale of what is being stripped is on the same order of magnitude as the medium from which it is being stripped; separating salts or organic components from an aqueous suspending medium is very energy intensive. In ultrafiltration (or microfiltration), a physical pore is built into a membrane in order to reject particles larger than about 20 Angstroms in diameter. A transmembrane pressure drives the suspending fluid through the membrane by convective flow, while the rejected biomolecule is separated because it cannot fit through the pores.
This technology is a very powerful energy saver. As Koros calculated, recovering a cubic meter of water using this technology costs about 6.7 kilowatt hours per cubic meter in primary energy (assuming 33 percent generation efficiency), compared to about 73 kilowatt hours per cubic meter using an optimized, fairly efficient triple effect evaporator. The real cost, in terms of mechanical energy, is about 2.2 kilowatt hours per cubic meter.
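Koros's figures are mutually consistent, as a short sketch shows:

```python
# Membrane vs. thermal separation: convert the mechanical energy of
# the membrane route to primary energy and compare with evaporation.
mechanical_kwh_per_m3 = 2.2    # pumping energy for the membrane route
generation_efficiency = 0.33   # fuel-to-electricity conversion
thermal_kwh_per_m3 = 73        # triple-effect evaporator

primary_kwh_per_m3 = mechanical_kwh_per_m3 / generation_efficiency
print(f"membrane, primary energy: {primary_kwh_per_m3:.1f} kWh/m3")   # ~6.7
print(f"advantage over thermal: "
      f"{thermal_kwh_per_m3 / primary_kwh_per_m3:.0f}x")              # ~11x
```

The roughly eleven-fold advantage is the same order-of-magnitude gap Koros cites later for seawater reverse osmosis versus the thermal option.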
The problems of this process boil down to the need for better control of pore size and uniformity. If the process involves larger solids or more complicated feeds, such as renewable feedstocks, it becomes very important to control the physical chemistry at the membrane surface. If one takes a step into the next part of that spectrum and considers stripping small molecules from other small molecules, or ions from water, the separation becomes an extremely expensive and energy-intensive proposition. Water must partition into the membrane more favorably than does the ion or organic molecule being rejected. In addition, once it is in the membrane, there has to be a more favorable molecular diffusion process to cause the separation.
While this technology exists as a functional one, it is only known to work well for aqueous systems in a highly evolved state, Koros pointed out. Seawater reverse osmosis is the only well-developed example. It is very compatible with wind generation, because the plant could be located offshore and wind could drive the pumps directly, bypassing some of the second-law restrictions on thermal generation of energy. The energy cost is about one-tenth that of the thermal option.
This could have implications on a worldwide basis. Around the globe, about a billion people do not have adequate drinking water. About nine billion gallons of water are desalinated every day, half of which is produced thermally in plants that were constructed before membrane technology existed. Thanks to investment in research, most new desalination plants are now membrane-based. The savings could be about 1.4 quads a year, Koros calculated, which is essentially a payback on the roughly $1 billion of research that was invested in the membrane option over the last 40 years.
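The 1.4 quad figure can be roughly reconstructed from the numbers in the text; the per-cubic-meter savings below is an assumption based on the evaporator-versus-membrane comparison quoted earlier:

```python
# Rough reconstruction of the ~1.4 quad/year savings estimate.
GALLON_M3 = 3.785e-3             # cubic meters per U.S. gallon
KWH_J = 3.6e6                    # joules per kilowatt hour
QUAD_J = 1.055e18                # joules per quad

daily_m3 = 9e9 * GALLON_M3       # ~9 billion gallons desalinated per day
thermal_share = 0.5              # half of that capacity is thermal
savings_kwh_per_m3 = 73 - 6.7    # primary-energy savings if done by membrane

quads_per_year = (daily_m3 * thermal_share * savings_kwh_per_m3
                  * 365 * KWH_J / QUAD_J)
print(f"{quads_per_year:.1f} quads per year")
```

The result lands very close to the 1.4 quads Koros calculated.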
There is a whole array of other membrane applications, such as olefin/paraffin separations, processes that remove sulfur and benzene from gasoline, or isomer separations that distinguish between normal and more bulky isomers. These separations can be performed, but not very well, with the current generation of membranes. To have an impact on energy use, they would have to be as efficient as a reverse osmosis unit.
Koros looked at another example of separation: propane and propylene, a significant market of about 25 billion pounds a year with growth potential on par with the GDP. Currently, the separation is very expensive and energy intensive, conducted through cryogenic distillation. A new unit costs about $50 million, but membranes could cut both the energy and the capital costs, Koros said. The problem is whether such a membrane actually exists. Propane and propylene are very similar; propylene has a compact end and a bulky end, but propane has only bulky ends. The difference in size is about half an Angstrom. “If you simply take a polymer and turn it into a carbon molecular sieve at 500 or 550 degrees, all of a sudden, you get an enormous selectivity because it becomes possible to do size and shape discrimination that is simply not possible by a polymer alone,” Koros explained. Differences of tenths of Angstroms can easily be distinguished.
However, cost is still a problem. This process is about a thousand times more expensive per square meter of membrane than the polymer process. “I think the only thing that can be done is analogous to walking on both legs. You don’t count on only your left leg or your right leg. Organics or polymers are very easy to process,” Koros explained.
Both technologies need to be developed. Inorganic- and carbon-based membranes are extremely selective because of their rigid size and shape discriminating ability. There must be investment in the development of a next generation of membranes that retains this exquisite size and shape separating capability. In some cases, organic polymers do fine, as in reverse osmosis. However, as the required selectivity increases, the next-generation technology is being pushed almost to a pure inorganic glued together by a polymer.
It can be achieved, as early work in the last couple of years has demonstrated. It is possible to put a million of these fibers into a module that is about a foot in diameter and about a meter long with the surface area of a football field. If the hybrid material can be put on the outside of the fiber while maintaining the inside as a flexible material, “it thinks it is a polymer in terms of mechanical properties but, in terms of its separating properties, it thinks it is a molecular sieve, or at least a hybrid material,” Koros said. The idea is to integrate this material into a practical process in which mixtures of molecular sieve entities and polymers are made up as a “dope.” It is spun into a hollow fiber about 200 microns in diameter, placed through fluid exchange, and dried. Instead of a thousand-fold higher cost, the estimate is about $5 a square foot or $50 a square meter.
“This technology is about where I would say aqueous reverse osmosis was in the late 1960s. It was clear that it worked, but it didn’t work very well, and there still has to be a significant investment made. We are making a decision. We are either going to invest in something that has this ability to cause an order of magnitude reduction or we won’t,” Koros said.
Participants asked if membranes offer an opportunity to separate a molecule the size of water from a 300-, 400-, or 500-gram-per-mole pharmaceutical, for example in a wastewater treatment facility. This process would fall somewhere between a true solution-diffusion process and one that has aspects of a filtration process, Koros said. Membranes exist in this dimension, and these are not difficult separations in aqueous systems.
Separations of hydrocarbons face the problem that hydrocarbon molecules adsorb on the surface of the inorganic membrane, one of the participants noted. A major interest is in inorganic membranes because they withstand very high temperature conditions. There are some thoughts that, by coating the surface of the membrane material, the adsorption factor could be eliminated. Koros replied that composite membranes are variations of that idea and are easier to process. Usually, depending on what the hydrocarbon is, it is possible to make the coating tight enough that it will not allow large molecules in. The coating already does part of the separation, and it acts as a sort of raincoat for the membrane that is doing the fine separation. “Now, in terms of being able to use a pure inorganic or carbon, I won’t say that that will never happen. I am afraid that what has to be done is, we need to get onto the field with some technology that actually works, so that people don’t invest in these high energy intensive things,” Koros said.
GREEN CHEMISTRY FOR CARBON MANAGEMENT
Klaus Lackner talked about the challenge of using fossil fuels, a new fuel economy, and how to intelligently and safely dispose of CO2 produced by the world’s population.
The situation of fossil fuels is precarious. If 10 billion people, the potential future global population, start consuming energy at the same rate as the United States, Lackner said, CO2 emissions will lead to climate change on an unprecedented scale. He stated that shortages of oil and gas are highly likely, and he pointed out that, for the past 100 years, estimates have consistently put the remaining oil and gas reserves at about 30 years. “The other unfortunate point is [that] all of that oil is concentrated in the Middle East,” Lackner added.
He said the two problems, climate change and the exhaustion of global oil reserves, will become acute at the same time, probably around the middle of the century. He emphasized that fossil energy is absolutely vital to the economy. “It is about 85 percent of the total, and I have a very hard time seeing that, in the short term, it is going to be replaced,” Lackner said.
But the scale of energy consumption is so large that it might be difficult to find alternatives. The three big options are solar energy, nuclear energy, and fossil fuels. Solar energy is not likely to provide a complete substitute anytime soon, even though humanity currently uses only about one ten-thousandth of the solar energy reaching the earth. Nuclear energy is the other big player; people argue that uranium reserves might be too limited, but if the technology were better, that would not be a problem, Lackner said. That leaves fossil fuels carrying most of the world's energy consumption for at least another decade or two. Lackner projected that fossil fuel will remain available for the next 100 to 200 years at prices similar to today’s. To underscore that point, Lackner recounted South Africa’s situation under embargo, a time when the country still managed to produce gasoline from coal for about $45 a barrel.
From a raw resource point of view, Lackner argued there is no guarantee that fossil fuels will run out in this century. Unfortunately, the environmental havoc will be horrendous if the CO2 problem is not resolved. This means that the fossil fuel cycle has to be engineered; products must be benign and safe, either for use, or for ultimate disposal. It also means that fuel has to be produced from all fossil resources, and advances in gas to liquid and solid to liquid transformations must be made to bring prices down.
Disposing of the carbon dioxide is a major challenge, Lackner said. Lackner stated his point of view that any chemical returned to the environment should be in its ground state. This requires a thermodynamic transformation from CO2 to an even lower state, the carbonate form.
The amount of CO2 in the atmosphere is increasing. In 1800, it held about 550 gigatons of carbon; it is now in the 750 gigaton range. Fossil fuel consumption through mid-century will total about 600 gigatons of carbon, which is equal to the entire standing biomass, and it is not clear whether fossil fuel use will grow by another factor of three or four in the coming century. The CO2 output could thus become huge compared to the carbon content of soil and biomass. It is already large compared to the storage capabilities of the ocean: there may be 39,000 gigatons of carbon dissolved in the ocean, but carbon cannot be removed or added without drastically changing the ocean's pH. A pH change of 0.3 would correspond to roughly 1,200 to 1,400 gigatons of dissolved CO2.
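As a cross-check, the present-day inventory can be converted into the more familiar mixing ratio using the standard factor of about 2.13 gigatons of carbon per ppm of CO2 (a conversion factor not given in the text):

```python
# Convert the atmospheric carbon inventory to a CO2 concentration.
GTC_PER_PPM = 2.13               # gigatons of carbon per ppm of atmospheric CO2
current_ppm = 750 / GTC_PER_PPM  # from the ~750 gigaton figure in the text
print(f"about {current_ppm:.0f} ppm CO2")
```

The result, roughly 350 ppm, is about the measured level around 1990, so the 750 gigaton figure is plausible.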
Some hypothetical disposal grounds for CO2 are the ocean, biomass, and the soil. Acidifying 30 percent of the ocean's volume would absorb some fraction of the CO2, and increasing biomass by 50 percent, or soil carbon by another 30 percent, would absorb more. None of these options is ecologically acceptable, feasible, or practical with current technology, Lackner said, and even together they would not come close to covering the emissions if business ran its course as usual.
Even in a no-growth scenario, CO2 will be a huge problem. The last 200 years have produced 300 gigatons of carbon, and another 300 gigatons will be released before 2050. “So, this, in a nutshell, is the problem. The fossil carbon pie, in some sense, is rather limited,” Lackner said.
To cope with these problems, Lackner said, all three major energy options have to be left open, since current solar capacity cannot be counted on as a full replacement and nuclear energy is far too complex and too expensive to replace fossil carbon on its own. “We simply cannot abandon in the foreseeable future the one option which currently works, but I don’t want to belittle the problems,” Lackner said.
This still requires massive changes. An entirely new energy industry
has to be built, and CO2 emissions between now and 2050 must be held constant. It also means the establishment of an energy economy at the current, or double the current, size that will not emit CO2 while simultaneously allowing the current CO2-emitting energy economy to exist.
Lackner cautioned against depending on rising efficiency. “If you throw every efficiency and every trick in the book into this game, you might be able to hold things constant until 2050. The problem is, at that point, when the options run out and CO2 levels start rising again naturally, the effect is effectively zero by the end of the century,” Lackner said.
This is why there is a need for new technologies to keep CO2 levels constant. CO2 has to be collected and disposed of at the big concentrated sources in a permanent and safe manner. At the same time, CO2 has to be captured from the air in order to deal with the waste produced by the transportation sector.
Lackner then discussed the options for disposing of carbon dioxide. One of the proposed routes is storing it in the ocean. Since the oceans will simply acidify, Lackner said that idea has been discredited. Furthermore, the turnover requires an 800-year time scale, which means the greenhouse gas problem is merely postponed to be dealt with by future generations.
The second option is to put CO2 under the ground. This is done today: the United States buries some 20 to 30 million tons of CO2 for the sake of enhanced oil recovery. CO2 can also be pumped into coal beds to displace methane or into saline aquifers. Most estimates suggest 300 gigatons of carbon can be stored in these ways. “But by the scale we are looking at, this is not enough,” Lackner pointed out. Furthermore, the carbon needs to be stored securely for a minimum of 10,000 years.
The third possibility is going to the thermodynamic ground state. Carbonates sit at a lower energy state than carbon dioxide. Carbonic acid dissolves serpentine rock, which reacts to form silica and magnesium carbonate. The reaction is exothermic, producing about 63 kilojoules per mole of CO2, and the free energy points in the right direction even at ambient CO2 concentrations.
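The carbonation chemistry Lackner describes is, for serpentine, commonly written as follows; the stoichiometry and the heat of reaction come from standard mineral-carbonation references rather than from the talk itself:

```latex
% Carbonation of serpentine: magnesium carbonate and silica are the
% thermodynamic ground state, so the reaction is exothermic.
\mathrm{Mg_3Si_2O_5(OH)_4} + 3\,\mathrm{CO_2} \longrightarrow
  3\,\mathrm{MgCO_3} + 2\,\mathrm{SiO_2} + 2\,\mathrm{H_2O},
\qquad \Delta H \approx -63\ \mathrm{kJ\ per\ mol\ CO_2}
```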
But there are still some drawbacks, Lackner said. While the reaction is spontaneous and will stabilize itself, it takes about 100,000 years to occur. As a result, there is a need for an industrial process to bring the reaction time to under an hour. Mining the rock and dealing with the tailings are both affordable, as is reclaiming the mine afterward, and all this can be done for less than $10 a ton of CO2; the reaction is simply not fast enough. Forcing the reaction with brute energy would be self-defeating, but at this point, with a 40 or 50 percent energy penalty, it can be done for $100 per ton of CO2. That is a factor of three or four higher than the desired cost of about $30 per ton of CO2, “at which point you add maybe two cents to the kilowatt hour and about 25 cents to the gallon of gas,” Lackner said.
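The quoted price pass-through can be sanity-checked with typical emission factors; the roughly 1 kilogram of CO2 per kilowatt hour for coal-fired electricity and 8.9 kilograms per gallon of gasoline are assumed round values, not figures from the talk:

```python
# Translate a $30/ton CO2 disposal cost into electricity and gasoline prices.
disposal_per_kg = 30.0 / 1000    # dollars per kilogram of CO2
co2_per_kwh = 1.0                # kg CO2 per kWh, typical coal plant (assumed)
co2_per_gallon = 8.9             # kg CO2 per gallon of gasoline (assumed)

cents_per_kwh = disposal_per_kg * co2_per_kwh * 100
cents_per_gallon = disposal_per_kg * co2_per_gallon * 100
print(f"adds {cents_per_kwh:.0f} cents/kWh and {cents_per_gallon:.0f} cents/gallon")
```

The result is in the same range as the two cents per kilowatt hour and 25 cents per gallon that Lackner quotes.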
There is a need for the right catalyst and the right preprocessing step. Lackner suggested first dissolving the material with a weak acid to make the magnesium salt, and then switching it to the carbonate to recover the acid. The weaker the acid, the easier it is to recover, but a weaker acid also means a smaller reaction rate. Determining how to gain another factor of three to five, perhaps even ten, in this process would make all the difference between success and disaster, Lackner said. From a policy point of view, it is absolutely critical because it signals an open door. “The only one of the methods which… opens the door to the next 100 to 200 years is this dramatic step of forming carbonates,” Lackner stressed. There is plenty of peridotite rock, which contains olivine, serpentine, and other magnesium silicates. Oman alone has more serpentine than would be needed to deal with all the carbon reserves in the world, and such rock is widely distributed around the world, Lackner said.
The obvious place to capture CO2 is at power plants, Lackner said, which could also be hydrogen plants, since hydrogen is likely to come from fossil fuels for a long time. Lackner estimated prices per gigajoule of energy for various fuels to back this prediction. Coal on average costs less than a dollar per gigajoule. Oil is $6 per gigajoule at $30 per barrel, and electricity at five cents per kilowatt hour is $14 a gigajoule. If hydrogen is to be made from electricity, it will cost at least $20 a gigajoule; from natural gas or coal at today’s prices, it is $6. The obvious prediction is that hydrogen will be made from the cheapest source, coal. Hydrogen can also be derived from tar, coal, shale, or biomass, but it is very unlikely in the foreseeable future to come from wind, photovoltaics, or nuclear energy. “Unless you can get down to a cent or a third of a cent per kilowatt hour, which I think is an achievable goal for photovoltaics in the long term, you cannot make hydrogen from it,” Lackner said.
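The electricity figure is a straightforward unit conversion, since a gigajoule is about 278 kilowatt hours:

```python
# Electricity price converted from cents per kWh to dollars per gigajoule.
kwh_per_gj = 1e9 / 3.6e6             # ~277.8 kWh in a gigajoule
dollars_per_gj = 0.05 * kwh_per_gj   # at 5 cents per kilowatt hour
print(f"${dollars_per_gj:.0f} per gigajoule")
```

This reproduces the $14 per gigajoule that Lackner cites.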
Bypassing the CO2 problem by using windmills may not be possible. The energy that feeds the wind is about 20 times the energy the world consumes today. It is not clear how much wind energy can be harvested without having an impact on the wind field and thus perhaps on climate. Furthermore, a windmill would require at least 80 square meters of rotor-swept area in order to supply enough energy for a single person in the United States. In comparison, the CO2 output per person would flow through an opening the size of a television screen. Therefore, a device to capture the CO2 produced per person would be a factor of several hundred times smaller than one to collect wind energy for that same person. With the ability to capture CO2 from the air comes the option of either making hydrogen from fossil fuels and collecting the CO2 at the hydrogen plant or running your cars on gasoline and capturing an amount of CO2 from the air that compensates for the emission. In addition, if renewable energy
becomes affordable, it is possible to create synthetic carbon-based fuels from CO2 and H2O by using the energy to reduce carbon and hydrogen.
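The rotor-area and television-screen comparisons can be roughly reproduced. Every input below is an assumed round number (per-capita power, extractable wind power density, wind speed, and the CO2 content of ambient air), so this is only an order-of-magnitude sketch:

```python
# Per-person wind collector area versus per-person CO2 capture opening.
per_capita_power_w = 10_000      # U.S. primary energy use, roughly 10 kW/person
wind_w_per_m2 = 125              # extractable wind power per m^2 of rotor (assumed)
rotor_area = per_capita_power_w / wind_w_per_m2
print(f"rotor area per person: {rotor_area:.0f} m^2")

co2_kg_per_s = 20_000 / 3.15e7   # ~20 tons of CO2 per person per year, in kg/s
wind_speed = 6.0                 # m/s, a typical wind speed (assumed)
co2_kg_per_m3_air = 7e-4         # CO2 mass per cubic meter of ambient air (assumed)
opening_m2 = co2_kg_per_s / (wind_speed * co2_kg_per_m3_air)
print(f"capture opening per person: {opening_m2:.2f} m^2")
```

Under these assumptions the rotor comes out near 80 square meters and the opening near 0.15 square meters, a ratio of several hundred, as Lackner argues.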
Even hydrogen might not be feasible. If the cost drops to $30 per ton of CO2, hydrogen will still not be competitive because the distribution system for the hydrogen will be very expensive. If hydrogen is piped from a central power plant which collects its own CO2 to destinations across the country, it will cost a lot of money.
However, the dream of the hydrogen economy is to close the loop: a renewable energy source splits water into oxygen and hydrogen, the hydrogen goes to the consumer, and the consumer recreates water. If CO2 can be captured from the air, the same loop is only slightly more complicated. The CO2 and hydrogen can be fed to an old-fashioned Fischer-Tropsch process to make gasoline. This means the ability to capture CO2 may actually open the door for carbon, in any of its hydrocarbon forms, to become an alternative energy carrier. The world may then no longer need fossil fuels if this alternative to hydrogen can be used in vehicles.
The future might hold a spectrum of fuels ranging from pure carbon to pure hydrogen and, within that spectrum, a fuel of choice that can be oxidized. “At the point where you use it, you make CO2 and water, and you give it back,” Lackner said. In a situation where it is very easy to supply hydrogen for an application, for example a city bus in a captive fleet, hydrogen might be preferable. This opens up a whole new chemistry of sorting out which fuels are appropriate for which circumstances and how many different ones can be supported.
New power plants, recovering CO2, and the chemical transformation of CO2 into a stable deposit, will all open doors. But there will have to be an energy revolution in the next 60 years. “If we did what we did the last 50 years, which was essentially doing the same thing slightly better, and incrementally more and more and more of it, we cannot repeat this for another 50 years,” Lackner said.