Innovation in an Age of Global Science | Commentary

Posted March 16, 2015 at 12:30pm

Scientific research is dramatically more global in its practice and impact than it was just a decade ago. Whether the United States is able to capitalize effectively on new discoveries stemming from international collaborations will determine future economic growth and job creation in America.

High-energy physics, or particle physics as it is often called, is the epitome of the growing globalization of science, and its recent history holds lessons for the innovation possibilities of tomorrow.

Not long ago, the United States was home to three world-class accelerators, the mega-machines that are the mainstays of particle physics. Now all three — Brookhaven’s AGS, Stanford’s Linear Collider and Fermilab’s Tevatron — are mere memories. What remains is their legacy: pioneering technologies that are still vital to accelerator facilities throughout the world.

Today, many American particle physicists are members of international teams working at such facilities, principally the Large Hadron Collider, a mammoth circular machine 17 miles in circumference and buried as far as 574 feet below ground near Geneva, Switzerland. Dan Brown fictionalized the machine half a dozen years ago in “Angels and Demons,” a best-selling tale he spun around the creation of a single gram of anti-matter.

The LHC, as physicists around the world call it, never did what Brown imagined it could, but it did produce the Higgs boson with the help of 1,500 U.S. scientists. In physics circles, the Higgs discovery represented a dazzling conclusion to decades of work by thousands of scientists from more than 30 countries. But along the way, CERN, the European Organization for Nuclear Research, which hosts the LHC, spawned something even more revolutionary, at least in a public sense.

Spread across the globe, CERN’s scientific cast desperately needed a communication tool that could transmit images and massive sets of data. In 1989, Tim Berners-Lee, working in Geneva, delivered the tool: the World Wide Web. And in fewer than 20 years, the Hypertext Transfer Protocol, HTTP, transformed commerce, entertainment, finance and the way we connect with each other.

Clever people in any nation could have seized on Berners-Lee’s creation and made their fortunes. But the dot-com revolution happened here. And it was no accident.

We had a well-honed entrepreneurial engine: brainy students at top-flight universities, science agencies with a history of supporting the best peer-reviewed research, a legacy of federally funded science discoveries ripe for exploitation, risk-taking venture capitalists with deep pockets and well-established laws that protected intellectual property. What ensued was nothing short of one of the most spectacular episodes in the American dream.

The High-Performance Computing and Communications Initiative — credit Al Gore — enabled Marc Andreessen and Eric Bina, then students at the University of Illinois at Urbana-Champaign, to develop Mosaic. That graphical browser, introduced in 1993, morphed into Netscape Navigator and eventually into today’s most popular tools, among them Internet Explorer, Mozilla Firefox and Safari.

In 1994, two Stanford electrical engineering graduate students, Jerry Yang and David Filo, developed the portal “Jerry and David’s Guide to the World Wide Web.” Today, we know it as Yahoo.

A few years later, Larry Page, who was pursuing his Ph.D. at Stanford, teamed up with fellow student Sergey Brin, who was drawing support from the National Science Foundation, and began work on a Web crawler. Their venture is better known as Google.

In 1998, Peter Thiel and Max Levchin launched PayPal, the electronic money transmission service. Reid Hoffman, who was one of its board members, co-founded LinkedIn in 2002. Two years later, Mark Zuckerberg and four of his Harvard classmates launched Facebook. Social media was here to stay.

All that was missing was hardware to make the connectivity portable. In 2007, Steve Jobs turned hope into reality when he unveiled the iPhone. Jobs was a visionary and an extraordinary salesman. But neither he nor his company, Apple, was a technology originator. The iPhone and the iPad, which followed in 2010, were enabled by decades of research funded by the U.S. government.

The World Wide Web was the product of international science. Any nation could have capitalized on it, but the United States had the capability to run with it best and fastest. Today, science is even more global. Whether we will be able to replicate our past success when the next big thing happens somewhere else in the world depends on the preparations we make now: ramping up support of public universities, reinvigorating federal research budgets and getting venture capitalists to return to the days when risk was not a dirty word.

Michael S. Lubell is the Mark W. Zemansky Professor of Physics at the City College of the City University of New York and director of public affairs of the American Physical Society. He writes and speaks widely about scientific research and science policy.