In a relatively short time, the connecting of just a few computers in the early 1970s has become an Internet that billions of people can tap into anywhere, anytime. Over the past 10 years, this connectivity has exploded into an era when nearly any device can be Internet-enabled. In a framework known as the Internet of Everything, we are now connecting not only computers and people, but also our phones, wearable technology, and home devices such as lightbulbs and thermostats. As we look to the wireless future, government investment will be key to developing new technologies to redesign cellular networks, overcome limitations on bandwidth, and advance the sensor technology that will pave the way for networked sensors in the body, increased automation of cars, and other applications that cannot yet be predicted.
The government’s investment in research, early implementation of networks, and collaborations with both academic institutions and industry were instrumental in bringing about what we know today as the Internet. In this chapter, Internet pioneers share the stories of innovation in three key areas: Vint Cerf reflects on the emergence of the Internet; David Culler describes the integration of the Internet into the objects that surround us; and Andrea Goldsmith shares her perspective on the past and future of wireless technologies.
Vint Cerf, vice president and chief Internet evangelist at Google, focused his presentation on the government’s central role in collaborations that led to the creation of the Internet.
Known as one of the fathers of the Internet, Cerf has had a front-row seat for the Internet's creation, played a leading role in it, and remained at the cutting edge of networking innovation throughout an illustrious career.
The history of the Internet is a clear demonstration of the crucial role of interplay among government, researchers, and industry in breaking new ground for IT advances and applications. “Every sector in our social and economic system has been engaged and continues to be engaged in the Internet,” he said, adding that “we have managed to mutually reinforce the interest, capacities, and capabilities of many different parts of our social and economic system to keep the Internet growing and going.”
Cerf began with a history of the Internet’s most direct predecessor, the ARPANET, which connected academic institutions funded by ARPA (now known as the Defense Advanced Research Projects Agency, or DARPA) to conduct artificial intelligence and computer science research. A primary impetus for its development was to allow collaborating institutions to share computing capacity and co-develop software.
An early collaboration between the Lincoln Laboratory at the Massachusetts Institute of Technology and the System Development Corporation led to a groundbreaking test around 1966 that demonstrated the potential for two separate computers to exchange blocks, or "packets," of information. About 2 years later, ARPA sent out a request for quotation to build packet switches, called interface message processors (IMPs), for the ARPANET. Bolt, Beranek, and Newman (BBN)—a technology company in Cambridge, Massachusetts—won the contract. As a result, BBN contributed a key industry component to ARPANET's evolution: Bob Kahn at BBN wrote Host-to-IMP Specification 1822, describing how to implement an interface that lets host computers connect to the IMP; this specification was subsequently made available to the academic participants in the ARPANET project. In today's Internet terminology, the IMP served as the ARPANET's "router," a system for orchestrating the exchange of packets of information between networked computers.
Nearly all of the foundational technologies underlying ARPANET were developed not by one person or organization but by an interdependent, collaborative network of academic, industry, and government experts and experimenters. The development of the host protocol, for example, which allows networked computers to recognize one another’s identities and locations, was led primarily by Steve Crocker at the University of California, Los Angeles (UCLA), with early-stage involvement from the University of Utah and other institutions. In recounting this story, Cerf pointed out that the development of the ARPANET and subsequent Internet was not purely a U.S. pursuit. Several foreign visiting scientists at UCLA contributed to the early phases of host-to-host protocol development.
In addition, the development of Telnet, an important remote access protocol, resulted from collaborations involving UCLA, Stanford Research Institute (SRI), RAND, BBN, and the Massachusetts Institute of Technology (MIT). Development of the File Transfer Protocol was led by Abhay Bhushan at MIT, and the development of networked e-mail was led by BBN; both efforts involved numerous collaborators. Even ARPA was directly involved, with ARPANET director Larry Roberts writing one of the first TECO macros to parse and display e-mail message files—an example of ARPA not only funding research but also engaging in technology development itself.
The first public demonstration of ARPANET came in October 1972, about 3 years after nodes were installed at UCLA, SRI, the University of California, Santa Barbara, and the University of Utah. In 1975, ARPA handed over ARPANET operation to the Defense Communications Agency, now called the Defense Information Systems Agency (DISA). However, BBN continued to handle the key technical operation and ran the network operation center in Cambridge, Massachusetts.
From 1973 to 1974, the initial network design for the Internet began as a collaboration between Bob Kahn, who was then at ARPA, and Cerf, then a professor at Stanford University. Together they designed the TCP protocol (later, the TCP/IP Internet network protocols), a core set of protocols still used to communicate across the Internet.1 Multiple academic, industry, and government players interacted frequently to develop the Internet's foundation. Cerf was appointed to the International Packet Network Working Group, spawned during the 1972 ARPANET demonstration. The research and development company Xerox PARC, located close to his lab at Stanford, sent its researchers to attend Stanford seminars on Internet design, contributing their experience with the PARC Universal Packet and Ethernet, two important communications technologies. In the mid-1970s and early 1980s, Cerf said, numerous academic and government researchers were using TCP/IP. With ARPA's encouragement, industry players, including some at IBM research, HP research, and the Digital Equipment Corporation (DEC) Systems Research Center, implemented TCP/IP in a research context. Despite having their own proprietary networking protocols, the companies' research teams were interested in and excited about a nonproprietary global network. Because these companies implemented TCP/IP in their operating systems, it was possible by January 1983 to ask that all the computers on the ARPANET and the packet satellite and packet radio networks convert to TCP/IP, a step that would later play an important role in the commercialization of the Internet.2

1 V.G. Cerf and R.E. Kahn, 1974, A protocol for packet network intercommunication, IEEE Transactions on Communications 22(5):637-648.
As the Internet gained steam, the U.S. government helped to fuel its momentum with both formal and informal collaborations. Seeing the value of quick information exchange and shared computing power, the Department of Energy (DOE), NASA, the National Science Foundation (NSF), and DARPA all implemented their own networks: the DOE Energy Science Network, the NASA Science Internet, NSF’s CSNET, and later NSFNET, and at DARPA, the packet satellite net, the packet radio net, and ARPANET. These networks were aggregated to form the Internet.
“Government representatives themselves also collaborated very directly with regard to program planning and financing [of the Internet],” explained Cerf. “They formed something called the Federal Research Internet Coordinating Committee, which was mostly represented by program managers from DOE, NASA, NSF, and DARPA. Eventually that became formalized as the Federal Networking Council, which had representatives from many other parts of the U.S. government in addition to the four initial funding agencies.”
In the early 1980s commercial organizations began to recognize the potential profits in providing equipment to support the Internet. For example, 3Com, a spin-off from Xerox PARC, made commercial Ethernet devices and eventually software that ran TCP/IP. Proteon was spun off from MIT, Cisco Systems from Stanford University, and Bridge and Sun Microsystems from Stanford and the University of California, Berkeley.
In an important shift toward the late 1980s, companies moved beyond the physical equipment and began offering Internet services. One step toward commercialization resulted from a collaboration between MCI, which was participating in the NSFNET backbone, and Bob Kahn's nonprofit organization, the Corporation for National Research Initiatives. The MCI commercial mail service was connected to the NSFNET backbone (with permission from the Federal Networking Council), even though commercial traffic was normally prohibited under the NSFNET appropriate use policy. Around the same time, three commercial Internet service providers emerged: UUNET, PSINET, and CERFNET. These were interconnected over a commercial Internet exchange that mirrored the federal Internet exchanges connecting networks at DOE, NASA, NSF, and DARPA.
2 Reviewers of this report noted the important contribution of Berkeley Unix, an academic project that enhanced AT&T Unix in many ways, including by adding TCP/IP networking support. AT&T allowed it to be distributed widely to academia, which spread the use of TCP/IP. TCP/IP adoption was also encouraged by DARPA, through its funding of the SUN workstation, which ran Berkeley Unix.
A hearing convened by Senator Al Gore of Tennessee in September 1986 marked an important milestone in the Internet's expansion. During this hearing, Senator Gore asked whether the supercomputer centers that NSF was funding should be interconnected with an optical fiber network; subsequently, NSF commissioned the design and implementation of a backbone network for its National Research and Education Network Program while also providing subsidies for the creation of intermediate-level networks. In 1991, the passage of the High Performance Computing Act essentially established a broad initiative to create a national information infrastructure.
The next crucial milestone came in 1992, when the Boucher bill formally permitted commercial traffic on the NSFNET backbone. This collaboration between Congress, NSF, and the rest of the Internet community was a key step in the Internet's development because it made it feasible to commercialize the service so that the general public could use it. Reflecting on this history, Cerf explained that many of the developments that led to the creation and expansion of the Internet would never have happened without the strong linkages between government-sponsored research programs at the universities and the spinoffs that could commercialize the technology.
The World Wide Web, a term coined by Tim Berners-Lee, represented the next major step toward the Internet we know today. The first components needed to make the World Wide Web were developed by Berners-Lee at CERN in 1991: the HTTP protocol, a client and server mechanism, and browser and HTML specifications. In the years that followed, a number of browsers were developed, including Erwise, ViolaWWW, MidasWWW, tkWWW, Cello, and Spyglass, and later Internet Explorer and Firefox.
A defining moment came in 1993 when the NSF-funded National Center for Supercomputing Applications announced the first widely used graphical browser, Mosaic, the result of an effort pushed forward by Marc Andreessen and Eric Bina despite not being a specifically sanctioned project. With the release of Mosaic it became clear to users that the Internet could be more than a UNIX command line and could include imagery, formatted text, color, and—eventually—video and audio. The browser quickly gained popularity: "Mosaic represented a massive transformation for the way the Internet was perceived," said Cerf.
Just a year later, Jim Clark and Marc Andreessen co-founded Netscape Communications, a company whose 1995 stock market launch would trigger the dot-com boom. As with previous Internet developments, this milestone reflected a mix of academic and industry contributions, but this time, with the involvement of the stock market. The dot-com boom continued unabated until April of 2000.
Cerf emphasized that none of these important early Internet developments would have been possible without the U.S. legislature funding the research agencies involved. “The lessons I take away from this in terms of the power of collaboration is that there was a very effective synergy realized between the government research agencies, which were persistent in their funding and willingness to take risk,” he said. “There was no guarantee that this program or this project would actually materialize successfully.”
During the Internet’s collaborative development, numerous institutions were created to fill important needs, a process that lent the Internet its robustness and sustainability. “When we run into an issue that requires attention that seems to require institutionalization, the Internet community simply invents these things,” said Cerf.
All the collaborators played key roles in the development and spread of the Internet. The academic community invented and explored networking technologies in a nonproprietary fashion. Industry set about to commercialize the technology, making it widely available. The stock market was equally important because it brought about the rapid expansion of this capital-intensive business. Finally, the U.S. legislature provided the funding and the very light regulation of the Internet environment that enabled it to grow.
The positive reinforcement cycles brought by these groups continue to this day. “The ball bounces between the legislative side, the academic side, and the industry side, and these cycles are all mutually reinforcing, and they continue to increase the availability of the Internet everywhere,” said Cerf.
As remarkable as the emergence of the Internet was, its early development was constrained to the realm of full-fledged computers. Developments since the early 2000s have cast aside these constraints and ushered in a new era in which practically anything can be connected to the Internet, from phones and watches to thermostats and lightbulbs. This framework in which people, data, processes, and objects are all interconnected through the Internet is known as “the Internet of Everything” (or, when referring primarily to devices, “the Internet of Things”). David Culler of the University of California, Berkeley, discussed research that is helping make possible a world of ubiquitous Internet-enabled devices.
Before delving into the technological evolutions that have enabled the Internet of Everything, Culler began with a review of key consumer products illustrating the expansion of the Internet outside the (computer) box. In the late 1990s, the handheld Palm Pilot appeared on the market, Wi-Fi began allowing people to connect to the Internet on their laptops, and Qualcomm produced a small mobile phone. By the mid-2000s the first Apple iPhone brought the Internet to the mobile person and Nintendo's Wii video game console introduced sensing to computer games. The arrival of the "connected home," in which home appliances and accessories can be controlled from the Internet, was marked by the introduction of connected lightbulbs in late 2012 and the Nest thermostat in 2013. This trend continued, with the 2014 International Consumer Electronics Show revealing a shift away from tablets and smartphones to connected home devices and wearables.
Culler then described how the technology necessary to connect devices such as home security systems, lightbulbs, and game controllers to the Internet stemmed from a mixture of academic, government, and industry research and development initiatives.
In particular, DARPA funding that started in 1978 and increased in the mid- to late 1990s helped encourage the development of some of the key underlying networking technologies needed for interconnected devices. DARPA's Sensor Information Technology (SensIT) program, for example, was created to develop software for networks of distributed microsensors for military purposes; the program led to ad hoc deployable microsensors and distributed computing methods to accurately extract timely information for detecting, classifying, and tracking a target from a sensor field.3 Later, the DARPA Network Embedded Systems Technology (NEST) program arranged for Culler's group at the University of California, Berkeley, to create the building blocks for network-embedded systems. The program, launched in 2001, led to an operating system known as TinyOS, which could scale to extremely small devices in extremely large numbers and operate at extremely low power while remaining vigilant to potential stimuli.
NEST was designed so that all other contractors could use the platform and contribute to it, creating a rare nationwide open source hardware and software effort. At the end of the project, NEST demonstrated thousands of nodes with contractors in various locations. Although NEST was focused on military applications, these sensor networks quickly found use in many other areas. For example, the NSF-funded Center for Embedded Networked Sensing at UCLA focused on applying sensor networks to environmental monitoring, habitat tracking, and other scientific applications.
In parallel with these developments, industry and academic researchers were pushing forward on numerous other technologies that would ultimately converge to enable a plethora of Internet-connected devices. The development of Linux, a free operating system that could run on any computer, was an important milestone of the 1990s. The 1990s also brought smaller microprocessors and sensors, along with the proposal that a sensing and communication unit, at the time about the size of a silver dollar, could be made at the millimeter scale—approaching the size of dust.

3 S. Kumar and D. Shepherd, 2001, SensIT: Sensor information technology for the warfighter, in Proceedings of the 4th International Conference on Information Fusion, FUSION 2001, http://bit.csc.lsu.edu/~iyengar/images/contributions/TuC11.pdf.
Culler pointed to several academic projects from the late 1990s and early 2000s that further pushed the envelope. The Endeavour project at the University of California, Berkeley, for example, focused on making it more convenient for people to interact with information, devices, and other people.4 Although open source software seeded this community, it suffered from a lack of hardware. In 2000, to address this deficiency, Intel formed a network of university-based “lablets,” some of which worked on how to get a tremendous amount of computing into a constrained space.
Another milestone was the emergence, in 2003, of the IEEE 802.15.4 standard for low-rate wireless personal area networks. This standard defined the fundamental lower network layers of a wireless personal area network, offering low-cost, low-speed, low-power ubiquitous communication between devices. Several years later this standard and the new Internet Protocol version 6 (IPv6) came together into a new routing standard. Around 2005, NSF added momentum with its Networking Technology and Systems (NeTS) program, a grant program soliciting proposals in four research areas: programmable wireless networks, networking of sensors, broadly defined networking, and future Internet design.5 These developments paved the way for a burst of new devices and applications.
By 2008, technology had converged to the point where many of the challenges of connecting small, diverse devices to the Internet had been addressed. Techniques for minimizing "idle listening," for example, tackled the power consumption problem by allowing devices to keep their radios asleep and monitor for input only when there may be something to detect. This approach informed the IEEE 802.15.4e wireless standard, which is incorporated into event-driven devices, and similar principles underlie energy-efficient Ethernet. These developments were complemented by innovations in information routing and volume management, such as local rerouting and the Trickle algorithm, that allow networks and devices to continually adjust to changes in available networks and in the density of users so as to avoid flooding the network.
4 University of California, Berkeley, Electrical Engineering and Computer Science Department, 2014, “The Endeavour Expedition: Charting the Fluid Information Utility,” last modified July 22, http://endeavour.cs.berkeley.edu.
5 National Science Foundation, 2006, "Networking Technology and Systems (NeTS)," Program Solicitation NSF 06-516 (replaces NSF 05-505), Directorate for Computer and Information Science and Engineering, Division of Computer and Network Systems, March 6, http://www.nsf.gov/pubs/2006/nsf06516/nsf06516.htm.
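The Trickle algorithm works by adapting how often a node rebroadcasts its state: when neighbors agree, the broadcast interval doubles up to a maximum and redundant transmissions are suppressed; when an inconsistency is heard, the interval resets to its minimum so updates spread quickly. A minimal sketch of this timer logic, loosely following RFC 6206 (the class and parameter names here are illustrative, not from any particular implementation), might look like:

```python
import random

class TrickleTimer:
    """Sketch of the Trickle algorithm (RFC 6206): nodes slow their
    broadcasts when the network is consistent and speed up when an
    inconsistency (e.g., stale routing state) is overheard."""

    def __init__(self, i_min, i_max_doublings, k):
        self.i_min = i_min                          # smallest interval (s)
        self.i_max = i_min * 2 ** i_max_doublings   # largest interval (s)
        self.k = k                                  # redundancy constant
        self.interval = i_min
        self.counter = 0                            # consistent msgs heard

    def start_interval(self):
        """Begin an interval; pick a random transmit time in [I/2, I)."""
        self.counter = 0
        self.t = random.uniform(self.interval / 2, self.interval)

    def hear_consistent(self):
        """A neighbor broadcast consistent state; count it."""
        self.counter += 1

    def should_transmit(self):
        """Suppress our own broadcast if enough neighbors already spoke."""
        return self.counter < self.k

    def interval_expired(self):
        """Nothing inconsistent heard: double the interval, up to i_max."""
        self.interval = min(2 * self.interval, self.i_max)
        self.start_interval()

    def hear_inconsistent(self):
        """Inconsistency detected: reset so news propagates quickly."""
        if self.interval > self.i_min:
            self.interval = self.i_min
            self.start_interval()
```

The exponential back-off is what lets a quiescent network stay nearly silent, while the reset on inconsistency keeps propagation delay low, which is the balance between energy and responsiveness that the text describes.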
The combination of these advances led to a tipping point in small-device capabilities, and subsequent years have seen an explosion in the number and diversity of Internet-enabled devices for personal, home, business, and military use. Looking forward, Culler said, technology is entering a point where ensembles can be connected. “It’s not my smart device, it’s what happens when my smart device and a handful of them walk in with me to my home with its family of smart things that are also connected and are connected to various kinds of societal infrastructure, whether that be electric utilities or transportation,” he said. Future developments would be focused on discovery, integration, physical mashups, and metadata, Culler concluded, although along with these developments would come new challenges in another key area: privacy.
The advent of wireless technologies has been crucial to our transition toward the Internet of Everything, and these technologies will undoubtedly grow more crucial as the trend continues. A presentation by Andrea Goldsmith of Stanford University examined technical challenges facing wireless networks and the key role of government-funded research in advancing solutions.
Reflecting on her 30-year career in wireless communication, Goldsmith said today is the most exciting time for this technology. In her view, a big difference between the wireless past and the wireless future lies in who—or what—is exchanging information. Whereas in the past most information exchange was initiated or mediated by people, in the future devices themselves will likely be driving much of the communication: “We’re going from a world where we used to have people using wireless to communicate with each other and access information, and now we’re moving into a world of device-to-device communication,” said Goldsmith. “That’s going to require a complete rethinking of how we build wireless systems.”
Device-to-device communication will not only lead to new systems that we can imagine from today's vantage point, such as the next-generation cellular phone or Wi-Fi, but will also enable sensors in everything, even inside the body, she said. One potential application of these sensor networks is to develop smart homes and buildings that could, for example, lead to greater energy efficiency or detect when an elderly person suffers a fall and call for help. Goldsmith noted that health is another area where wireless technology is poised to make a huge impact. Cell phones are already changing the way medicine is practiced; for example, the technology already exists for someone in Africa to use a cell phone to take a photo of a blood sample and send it to a remote location for malaria detection. Goldsmith went on to explain that in-body sensors and networks also hold tremendous potential, though these will require completely new ways of communicating, perhaps using chemicals or sensors on neurons to power devices. For example, sensors around an artificial heart might detect a problem and send a wireless signal to a device that could initiate a lifesaving intervention. Neuroscience offers other exciting opportunities: the Whole Brain Initiative, for example, is starting to decipher how neurons in the brain are connected and what the signals do. Already it is possible to inject a signal into a particular part of the brain and reduce some of the symptoms of Parkinson's disease; a better understanding of signal encoding and decoding in the brain might allow scientists to build a tiny transmitter and receiver to compensate for damage or disease.
Despite the allure of next-generation wireless technologies and the Internet of Things, however, Goldsmith described significant challenges on the horizon and the need for innovative solutions to enable the wireless future.
One big challenge facing wireless technology is the inherent limit of the radio frequency spectrum—the medium through which all wireless data signals are transmitted, alongside television, radio, and GPS signals. The Federal Communications Commission (FCC) grants companies licenses to use slivers of this limited physical spectrum. Based on trends in smartphone use, a 2010 FCC report projected a cellular spectrum deficit of almost 275 megahertz by 2014, a deficit exceeding even the roughly 225 megahertz of spectrum in use at the time.6 The years 2010-2014 indeed saw exponential growth in demand for wireless data, driven primarily by video, and this growth exceeded the spectrum available in the cellular bands. However, these same years also saw growth in the availability of Wi-Fi networks, so users did not experience the full brunt of the cellular spectrum crunch.
While we may have weathered that storm, Goldsmith said current trends toward the Internet of Things point to a more concerning bandwidth shortage on the horizon, one that affects both the radio spectrum generally and Wi-Fi specifically. Forecasts indicate there will be on the order of 50 billion connected devices by 2020. Since wireless demands already exceed the spectrum available in the licensed bands, Wi-Fi is making up for the current shortfall, but this cannot continue indefinitely. Wi-Fi is also interference limited, so when 20 billion devices are using the same unlicensed spectrum, Wi-Fi will face a major crunch as well.
While acknowledging a significant amount of hype building around the Internet of Things, Goldsmith explained that there is enough evidence of the trend's emergence and impact for it to be taken seriously when projecting future wireless demands. In the transportation sector, there has been growth in automated highways, and semi-automated cars are already among us; the newest Tesla car, for example, can change lanes without driver input.7 Similar trends are happening in the health care sector, where people are already using wearable sensors to track heartbeat, physical activity, and other variables. "If you just look at these two sectors as already emerging as economically viable, I think there's no question that the Internet of Things is going to be very real," Goldsmith said. For this reason, she said the prediction of 50 billion connected devices is not unreasonable, and even if it turns out to be only 10 billion or 20 billion, that is still much more than today's wireless communication infrastructure can handle.

6 B. Reed, 2010, FCC projects 275 MHz 'spectrum deficit' by 2014, Network World, October 21, http://www.networkworld.com/article/2192490/wireless/fcc-projects-275mhz--spectrum-deficit--by-2014.html.
Goldsmith said there is still an open question whether the deficit in bandwidth is a result of poorly designed systems or of systems that have reached their physical capacity (also referred to as the Shannon limit of the physical layer, the maximum rate at which data can be sent over a particular bandwidth with arbitrarily low error).8 Pointing out that the Shannon capacity of many wireless channels is unknown and that even less is understood about the Shannon limit of ad hoc and sensor networks, she said more research is needed in this very theoretical field to understand whether better network design could help to solve the bandwidth shortfall.
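For the canonical point-to-point case, the Shannon limit is well established: a channel of bandwidth B hertz with signal-to-noise ratio SNR supports at most B log2(1 + SNR) bits per second with arbitrarily low error. A small illustrative computation (the function name is ours, not standard terminology):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Upper bound (bits/s) on reliable data rate over an additive-
    white-Gaussian-noise channel; SNR is linear, not decibels."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 20 MHz channel at 20 dB SNR (linear SNR = 100) caps out near
# 133 Mbit/s. Doubling the bandwidth doubles this ceiling, while
# doubling the SNR adds only about one extra bit/s per hertz --
# one reason opening new spectrum beats squeezing existing bands.
ceiling = shannon_capacity(20e6, 100)
```

No comparably clean formula exists for networks of many mutually interfering nodes, which is exactly the open theoretical problem Goldsmith described.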
A second problem lies in the design of cellular networks. Even though cellular technology is in its fourth generation, Goldsmith explained, the underlying design principles of today's cellular systems are identical to those of the first-generation analog systems: it is still assumed, for example, that the system is interference limited. However, multiple technological advances have emerged to address interference, including multiple-antenna (MIMO) techniques and multiuser detection, which was invented in the 1980s but only recently became implementable thanks to increases in computer processing power. Even though these advances mean systems need no longer be interference limited, the overall design of the networking system has not changed to take advantage of them. In addition, there is a growing need for cellular networks to become more energy efficient: one unknown, for example, is the minimum amount of energy necessary for a network to operate when power is limited, such as during an event affecting the power grid.
As a result of these trends and needs, it is time for a complete rethinking of cellular design, said Goldsmith, adding that this effort needs to be driven by the research world. Only after researchers show that a new design can net an order-of-magnitude improvement in a cellular system will it be adopted by industry. Because of its focus on short-term revenue, industry cannot afford to spend the time and money on the research and development needed to completely rethink cellular system design.

7 In October 2015, Tesla Motors announced that its new software release would incorporate additional self-driving technology (Tesla Motors, "Your Autopilot Has Arrived," October 14, https://www.teslamotors.com/blog).

8 C.E. Shannon, 1948, A mathematical theory of communication, The Bell System Technical Journal 27:379-423, 623-656.
Goldsmith said that this effort will in part involve determining the most important aspects of the cellular network: Is capacity the primary concern? Or power consumption? For example, if someone is trying to connect a device powered from an energy-harvesting battery, speed may not matter as much as connecting to the cellular network with minimum energy. Coverage is another issue: Can we build a cellular system that gets coverage everywhere, including indoors?
Goldsmith pointed to millimeter wave MIMO technology as a possible solution. There is a great deal of unlicensed open spectrum at 60 gigahertz and higher. However, operating at these frequencies comes with challenges, chief among them much higher signal attenuation. Antenna arrays containing hundreds of elements can compensate for the high attenuation, but exploiting them will require rethinking how systems are designed. "In my view, and we're doing some research on this, we really need a complete rethinking of system design to take advantage of these technologies," she said.
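The attenuation penalty at millimeter wave frequencies can be seen directly in the Friis free-space path-loss formula, sketched below. This is an idealized model; real 60 GHz links also suffer oxygen absorption and blockage, so the true gap is larger.

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB between isotropic antennas (Friis)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# At any fixed distance, moving from 2.4 GHz Wi-Fi up to 60 GHz adds
# 20*log10(60/2.4) ~= 28 dB of loss -- roughly the gap that large
# beamforming antenna arrays are meant to recover.
extra_loss_db = fspl_db(10, 60e9) - fspl_db(10, 2.4e9)
```

Because the loss grows with the square of frequency (in linear terms), compensating with array gain rather than raw transmit power is what motivates the hundred-element antenna arrays Goldsmith described.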
Another challenge is how to use Bluetooth, Wi-Fi, cellular, and even other networks in a seamless way. “What I really want is a big wireless cloud,” Goldsmith said. “I don’t care what network I’m on, I don’t need the icon on my phone telling me what wireless network I’m on, I just want it to work for whatever application I’m using.”
In wired networks, a big wave of research has focused on software-defined networking, an approach that might be usable for wireless networks too. However, wireless networks are fragmented, so switching from cellular to Wi-Fi typically requires closing a session on one network and opening another. Goldsmith envisions a potential software-defined networking design for wireless devices that uses a unified control plane to match the wireless network to the application being used. For example, a low-data-rate, low-energy application might use millimeter wave, Bluetooth, or lower power Wi-Fi instead of cellular.
Energy is the driving constraint for sensor networks, explained Goldsmith. Some sensors harvest energy from the environment while others are powered by batteries. Some battery-powered sensors, such as those embedded in a structure like a bridge, must last decades without recharging. To build communication systems that use extremely low amounts of energy, Goldsmith said, we need to start from scratch. Modulation, coding, and multiple-antenna techniques are all power hungry, not only in the energy they transmit but also in the processing they require. For short-range networks, it is important to examine how much energy the circuitry consumes. Zigbee and Bluetooth may be far better than Wi-Fi or cellular, but it is not known whether they are anywhere close to the minimum energy consumption possible, said Goldsmith, pointing to the need for more research in this area.
To conclude, Goldsmith discussed how theoretical research translates into practice and how practice can circle back to inform research. As a case in point, Goldsmith shared the story of her first start-up, Quantenna, which she launched after about 20 years as a researcher. At the heart of this effort was her desire to build something, a desire she traced back to her first job building an antenna array in the mid-1980s, an experience she said made her fall in love with wireless communication and inspired her research career. Quantenna makes Wi-Fi chips with the goal of achieving the best performance on the market, based on Goldsmith's research in communications theory, and the company recently announced a 10 gigabits per second Wi-Fi system that uses the most sophisticated physical layer in existence. Goldsmith cites this achievement as an example of applying deep theoretical research to build better systems. At the start-up, she said she learned that many aspects of wireless systems are poorly understood and that actually building a system revealed many questions that later fed back into her research and teaching.
She concluded her talk by pointing out that much research is still needed to realize a wireless vision, but that doing this work will allow wireless technology to change people’s lives worldwide. She also said that although she thinks that research has a profound impact on technology development and vice versa, a stronger connection or feedback loop from industry to universities would offer more synergy and allow researchers in universities to solve even more important problems—and government has an important role in making this happen. “Government and government-funded research were key for the development of wireless technologies. These technologies are central to the growth and success of mobile devices, but there is still more that needs to be done to get us where we want to go,” she said.