Dr. Wessner opened the next panel by welcoming the two speakers, Dr. Lee and Dr. Giesekus, and noting the value of hearing from two such accomplished yet different institutions: Dr. Lee's U.S. National Institutes of Health and Dr. Giesekus's Fraunhofer Heinrich Hertz Institute. The National Cancer Institute was long renowned for its basic research in biomedicine, while the structure and success of the Fraunhofer program had for some years been a topic of growing interest to the U.S. policy community, he said, as a potential model for building partnerships between the public and private sectors.
Dr. Lee said that because he saw few health researchers at the symposium, he would begin with a brief overview of cancer research. Cancer, he said, is a collection of many diseases that share the common behavior of uncontrolled cell growth. When cancer is localized in a tumor, survivability is relatively high if the disease is detected early. However, if it spreads beyond the point of origin, survivability drops; if it spreads, or metastasizes, through the lymph glands or bloodstream, the outlook worsens precipitously. Some 90 percent of cancer deaths are caused by this disseminated disease, or metastasis.
He then said he would try to explain why he had been invited to a meeting whose primary focus was innovation. For one thing, he said, the need for innovation in cancer research was urgent. Unlike other major disease killers, cancer continues to take nearly the same toll it did in 1950. While the death rate per 100,000 Americans dropped from 586 in 1950 to 203 in 2008 for heart disease, from 180 to 44 for cerebrovascular disease, and from 48 to 18 for pneumonia/influenza, the annual number of deaths from cancer held steady at about half a million. Americans now spend $124 billion per year on cancer healthcare.
And cancer is not just a burden for the United States, he said. A study by the WHO indicated that by 2020 cancer would claim about 10 million lives per year, making it one of the world's top 10 killers. This toll has increased since 2002, “so we truly need to think differently, which is what we are charged to do in my organization.”
A second reason Dr. Lee had been invited, he said, was that his own center was specifically charged with finding innovative approaches to the cancer battle. In addition, he said, it was “possibly the most exciting time to be doing biomedical research.” In 2001, the full sequencing of the human genome was completed, and last year investigators in the United States, in collaboration with partners elsewhere, launched the 1000 Genomes Project to examine gene-by-gene population differences. These and other programs have generated unprecedented amounts of information that is being placed in the public domain.
Jump-starting the Cancer Research Process
Is this increase in knowledge benefiting the patient, he asked? Maybe, but not fast enough. “So we asked ourselves about 10 years ago whether we could jump-start this process and make it more efficient, to go from ‘turning the crank’ to something different. We reached out to the research community, and used the responses to identify four key needs” that might enable researchers to understand cancer. The first was better standards and protocols, so researchers can speak the same language. The second was real-time public release of data to ensure prompt availability. The third was to structure research teams around multiple disciplines. And the fourth was to create an environment conducive to innovation. “If we could do all this,” he said, “we would have the potential to transform cancer, drug discovery, and diagnostics.”
These were daunting needs, Dr. Lee said, but his advisory group reported them directly to the president. The Center for Strategic Scientific Initiatives used exactly those needs to structure its mission—to build “…trans-disciplinary approaches, infrastructures, and standards to accelerate the creation and broad deployment of data, knowledge, and tools to empower the entire cancer research continuum…for patient benefit…”
His group then set out to answer their “big question”: why does cancer research not move faster? A central problem, he said, was that while everyone understood that cancer is caused by an alteration of genes, many had different perspectives on which changes were truly important. “So we said, let’s do a systematic identification of all the cancer genomic changes, repeat it for all cancers, and make it publicly available.” He launched a new Cancer Genome Atlas for three cancers: brain, lung, and ovarian. The center collected data on every patient who came through the program and placed the data in the public domain every two weeks—“almost in real time.” This was accelerated by stimulus funds from the American Recovery and Reinvestment Act (ARRA).
Finding an Unanticipated Innovation
When his group published its first paper fully characterizing the brain cancer, they reported a few unanticipated “innovations”: a possible resistance mechanism inside this disease, discernible only because they were able to review 400 to 500 patient samples. That data set became a reference set in 2009, and other investigators compared it to their own data sets. They were able to identify in their own cohorts patients who might be non-responders to aggressive therapy, thereby excluding them from therapies that would not help them and would in fact diminish their quality of life. Most recently, a new subset had been detected among younger patients in the brain cancer data set; preliminary evidence showed it is possible to better predict the outcomes for these patients.
To capitalize on this ability the center was about to launch the Cancer Target Discovery and Development Network (CTD2) to link the cancer genomics they had seen and some existing therapies. The hope, he said—”which was something new”—was that some existing drugs that worked well on some patients could now benefit other patients. “Most important, we will share the analysis, tools, and data in real-time fashion.”
As they gathered data about genes, they asked themselves whether they could do the same with proteins. The answer, from colleagues at the National Academies, was “not yet”—the field was not quite ready. “Because of variability in platforms; lack of standards, protocols, and reference data; and no consensus on how to report raw data, proteomic technologies were not yet mature,” he said. So in 2005, the center launched a pilot project called Clinical Proteomic Technologies for Cancer (CPTC)—not to bring proteomics to the clinic, but to prepare it for clinical use. Among its accomplishments to date is the first demonstration that MRM (multiple-reaction monitoring) is highly reproducible across multiple laboratories and technology platforms. The center also developed a public data portal for all the individual raw data points so people could share their data. Some unanticipated innovations were (1) the joint development with the FDA of a mock document that would allow a new generation of developers to move more easily through this regulatory boundary, and (2) the establishment of an antibody characterization laboratory that provides high-quality reagents at low cost to the community, posts all characterization data in a public database, and links with industry partners.
As of the previous day (in summer 2011), the center had launched Phase 2 of CPTC, using the exact specimens that went through the genomics program and analyzing the proteins of those same specimens. This data set will also be available online in real time.
Bringing Nanotechnology to Clinical Oncology
Meanwhile, the center had begun looking for ways to combine the power of nanotechnology with this new perspective. “We kept thinking, could we harness some of this disruptive innovation for clinical oncology?” The great promise of nanotechnology was to use tiny sensors and imaging to detect disease before health has deteriorated, to deliver therapeutics with improved accuracy and efficacy, and to develop research tools to enhance understanding of the disease.
In 2004 the center published the first nanotech plan solely directed at clinical utility. Others had used nanotech for basic biology, “but this one was different.” In the first phase, launched in 2007, the center was able to develop about 50 companies through SBIR awards and other mechanisms, file over 200 patents, and start eight to 10 clinical trials from the nanoplatform. In January 2011 they launched their first clinical trial that uses the nanoplatform to diagnose the patient, as well as a nanotherapy to deliver a gene-targeted therapy. The program has now entered its second phase.
Reaching Out to the Physicists
In conclusion, the center has produced a great deal of quantitative and reproducible data at the macroscale, microscale, and finally the nanoscale. The goal now is “to do something more complex; to really understand the system, and try to predict it using all this new data. We thought the best way to do that is to bring in another point of view, and when it’s a hard problem, we always turn to the physicists.” They recruited some 300 extramural physicists to attend three workshops in 2008, and found high interest in looking at this problem differently. The center started a network called the Physical Sciences Oncology Network.
“They came to us and said, you draw these beautiful diagrams of how the signaling networks work, but you never show us what actually happens in time and space; how things move around; how they are subjected to physical laws and forces.” We showed them how we think about metastases, he said, and “they said it may look more like this. They helped us look at it differently: to generate new knowledge, and catalyze new fields of study by utilizing physical sciences and engineering principles to enable a better understanding of cancer and its behavior at all scales. We were not looking for new tools to do ‘better’ science, but new perspectives to do paradigm-shifting science that will lead to exponential progress against cancer.”
Dr. Lee said that the center does have German physicists in its network, and always hopes to do science internationally. The network itself is designed to be open and receptive to collaboration and trans-disciplinary teams.
“What we are trying to do,” he concluded, “is to build the infrastructure that can better understand and control cancer through the convergence of
physical sciences and cancer biology. We envision a future where individualized medicine becomes a reality—individualized, targeted cancer care.”
Dr. Giesekus introduced himself as a mechanical engineer who had worked for a few years as a researcher in industry before returning to Fraunhofer. His institute, he said, has collaborations worldwide throughout the broad field of information and communications technology (ICT). A central objective is to extend the institute’s expertise in traditional telecommunications to new business sectors, including biomedicine.
The HHI was founded in 1928 to do research in telegraph and telephone engineering. It expanded after the war, and by 2003 was ready for a new phase of development as it joined the Fraunhofer organization. “Since then,” he said, “we have had to focus much more on marketing and earning a proportion of our budget from industry contracts.” One of HHI’s successes has been the development of the video coding standard H.264, which is comparable in importance to the MP3 coding standard developed for audio. H.264 is the basic coding for HDTV, Blu-ray, and other video standards, and is used by YouTube, the iTunes Store, Adobe Flash Player, Microsoft Silverlight, and other streaming Internet sources.
Medical Uses for Information Technologies
With some 400 employees, HHI works in almost every corner of the ICT sector, including multimedia, data processing, image processing, interactive media, photonics, sensors, and data networks. When it became a Fraunhofer institute and shifted toward marketing, its researchers discovered “some raw diamonds that needed only some polishing to move technologies we already had into the market. This talk is about how we get basic research results into the market for the medical device industry.”
For the past decade, for example, the institute has studied an autostereoscopic 3D display that can be used without eyeglasses. It depends on the ability of the display screen to detect the position of each eye and alter the pixels on the display accordingly; this allows the eyes to perceive a large “sweet spot” and optimal depth. “This is something we can use to show a patient an image in the field of medicine.”
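The principle behind such glasses-free displays can be sketched in a few lines: the screen interleaves the left-eye and right-eye views, and eye tracking shifts the interleaving so each eye keeps seeing its own view as the viewer moves. The toy Python below is purely illustrative; the function names and the simple column-interleave phase model are assumptions, not HHI's actual design.

```python
# Toy sketch of a glasses-free (autostereoscopic) display idea:
# interleave left- and right-eye images column by column, and shift
# the interleaving phase according to the tracked eye position so
# each eye continues to see its own view. Illustrative only.

def interleave(left, right, phase=0):
    """Column-interleave two equal-size views; `phase` (0 or 1)
    selects which view lands on the even columns."""
    out = []
    for row_l, row_r in zip(left, right):
        out.append([
            row_l[c] if (c + phase) % 2 == 0 else row_r[c]
            for c in range(len(row_l))
        ])
    return out

# One-row "images" labeled by view and column.
left = [['L0', 'L1', 'L2', 'L3']]
right = [['R0', 'R1', 'R2', 'R3']]
frame = interleave(left, right, phase=0)    # left view on even columns
shifted = interleave(left, right, phase=1)  # viewer moved: phase flips
```

In a real display the tracked eye position would drive the phase continuously, so the "sweet spot" follows the viewer.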
In a second example, the HHI collaborates with the Berlin Philharmonic and an automotive industry partner to create an immersive media platform with high-resolution audio, video, and 3D. This makes use of their own cinema technology, which “has the highest resolution world-wide.” It is already used for sports events, and HHI is planning to extend it to augmented reality and medical applications.
Another technology developed in combination with high-resolution video and augmented reality is “touchless pixel-precise 2D or 3D real-time tracking.” Almost like a shadow puppeteer, the user can control and even manipulate various images with outstretched fingers without touching them. A surgeon in a sterile environment would be able to directly control the content of a display by simply pointing. HHI worked with a medical equipment company to market this technology to clinics and hospitals.
Another example where video processing plays a large role is 3D image processing for biometrics. Usually 3D video must go through a postprocessing stage, which makes it difficult to use in real-time medical applications. From its long experience creating 3D information from 2D video, HHI is able to use multiple cameras to create perfect stereo impressions in real time, which is useful in many medical applications.
The same HHI department, using the same algorithms, has produced a “virtual mirror.” He described a young woman wearing a green shirt, looking into the mirror; as she looks, the color of the shirt changes; then the colors of nearby artwork change as well. In a medical application, such a technology could be used to display a surface such as the outside of the liver; a change in liver health could then be displayed as a different color to aid diagnosis or research.
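The recoloring idea behind the virtual mirror can be illustrated with a small sketch: pixels close to a target color (the green shirt) are swapped for a replacement color, while everything else passes through unchanged. The Python below is a minimal illustration under that assumption; the names and the simple RGB-distance segmentation are hypothetical, not HHI's actual algorithm.

```python
# Toy illustration of color replacement as in a "virtual mirror":
# pixels within a tolerance of a target color are swapped for a
# replacement color; all other pixels pass through unchanged.
# Simplified sketch, not HHI's real-time implementation.

def color_distance(a, b):
    """Euclidean distance between two RGB triples."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def recolor(image, target, replacement, tolerance=60):
    """Replace pixels within `tolerance` of `target` by `replacement`."""
    return [
        [replacement if color_distance(px, target) <= tolerance else px
         for px in row]
        for row in image
    ]

# A 2x2 "frame": green-shirt pixels and a grey background.
frame = [[(0, 200, 0), (120, 120, 120)],
         [(10, 190, 5), (120, 120, 120)]]
recolored = recolor(frame, target=(0, 200, 0), replacement=(200, 0, 0))
```

Run per video frame, the same mapping would turn the green shirt red while leaving the background (and, in the medical analogy, healthy tissue) untouched.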
A Technology to Detect Both Plagiarism and Cancer Cells
The HHI has been working for many years on a technology that detects plagiarism, whether on the Internet, in intellectual property (IP) materials, or elsewhere. It can detect similar pictures, or sift through a huge set of pictures to find the same cars, people, or other defining “fingerprint.” Such a technology might be applied to cancer detection, said Dr. Giesekus, if the algorithm can be used to identify telltale cancer indicators in a patient’s cells.
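One generic way to build such an image “fingerprint” is a perceptual hash: reduce an image to a short bit string that changes little under small alterations, then compare fingerprints by Hamming distance. The sketch below uses a simple average-hash technique as an illustration only; it is not HHI's proprietary method, and real systems operate on downscaled images and much longer hashes.

```python
# Sketch of a perceptual "fingerprint" for near-duplicate detection:
# an average hash maps an image to bits (above/below mean brightness),
# and similar images yield fingerprints with a small Hamming distance.
# Generic technique for illustration, not HHI's actual algorithm.

def average_hash(pixels):
    """Map a grayscale image (list of rows) to a bit string:
    '1' where a pixel is above the mean brightness, else '0'."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(h1, h2):
    """Count differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

img = [[10, 200], [220, 30]]
near_dup = [[12, 198], [210, 35]]   # slightly altered copy
unrelated = [[200, 10], [25, 240]]  # different image
```

Comparing `hamming(average_hash(img), average_hash(near_dup))` against a threshold flags the altered copy while the unrelated image scores far apart, which is the same matching idea whether the targets are plagiarized pictures or telltale cell patterns.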
The HHI is also working on a medical application of a technology that stabilizes video processing. Researchers have developed a core technology of high dynamic range (HDR) video that, like Photoshop, can recalculate the light or dark qualities of a picture. A problem in applying the same technique to video, he said, is that the dark and the light pictures are not taken at precisely the same time, so motion distorts the combined image. HHI engineers have learned to compensate for the motion, a technique that could be used to stabilize the images of an endoscopic procedure or other operation inside the body.
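The exposure-merging step of HDR imaging can be sketched as a per-pixel weighted average of a short (dark) and a long (bright) exposure, trusting well-exposed mid-range values most. The Python below is a minimal illustration of that idea; the weighting function is an assumption, and the motion-compensation step described above, which is the hard part for video, is deliberately omitted.

```python
# Minimal sketch of HDR exposure fusion: merge a short (dark) and a
# long (bright) exposure of the same scene, weighting each pixel by
# how well-exposed it is (mid-range values trusted most). Real HDR
# video must also align the exposures to remove motion; omitted here.

def weight(p):
    """Trust mid-range pixel values most (hat function over 0..255).
    The +1 avoids a zero total weight at the extremes."""
    return 1 + min(p, 255 - p)

def fuse(short_exp, long_exp):
    """Per-pixel weighted average of two exposures."""
    fused = []
    for row_s, row_l in zip(short_exp, long_exp):
        fused.append([
            (weight(s) * s + weight(l) * l) / (weight(s) + weight(l))
            for s, l in zip(row_s, row_l)
        ])
    return fused

dark = [[5, 100]]     # short exposure: shadows crushed
bright = [[80, 250]]  # long exposure: highlights clipped
hdr = fuse(dark, bright)
```

In the fused row, the shadow pixel leans toward the long exposure's value and the highlight pixel toward the short exposure's, which is exactly why misaligned frames (motion) would blur the result.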
Dr. Giesekus mentioned several other technologies with possible medical applications, including optical sensing at the nanoscale to identify dangerous bacteria or explosives; tiny fiber sensors, now used to measure stress and strain in buildings, to control the position of a colonoscope or other instrument; and cheap, handheld terahertz imaging devices to detect cancers, cavities in teeth, explosives, or other abnormalities.
“Our wish now with this technology is to find more partners in the fields of research that can use our technology,” he concluded. For this reason, HHI opened a new office in Boston in 2011. “We are always looking for industry partners and research institutes,” he said, “where our technology from telecommunications can be applied to other uses.”
Ambassador Murphy asked the speakers how they were able to select new areas for application of existing technologies. Dr. Lee said that his initiative was now known as “physical sciences oncology,” and this had opened the door to mathematicians, computational scientists, and others. “The network itself is very open,” he said. “There are pilot funds that the centers themselves solicit, so we ask them to be innovative clusters on their own. They bring in new people and new ideas. We have a grand challenge where all 12 centers work together, bringing in new expertise as well.” Dr. Giesekus said that in the HHI, almost all the scientists are physicists with backgrounds in telecom. These people, he said, “are our core knowledge. Now we are looking for new applications, so we need collaboration. So we bring together a few research institutions and a few companies, and the company keeps us on track because it has to earn money. So the Fraunhofer model keeps our eye on market solutions, while the solutions need the science to be successful.”