Covid vaccine development spurred by this World War II decision

Patricia Stamper, a registered nurse at Rocky Mountain Regional VA Medical Center, prepares a dose of the Pfizer-BioNTech Covid-19 vaccine before administering it to a healthcare worker at the hospital on December 16, 2020 in Aurora, Colorado.

Michael Ciaglo | Getty Images

The rapid development of Covid vaccines has sparked some debate over who deserves the most credit: the government with its Operation Warp Speed, the pharmaceutical companies, or the university researchers who pioneered the discoveries about messenger RNA.

The best answer, I think, is that vaccine development, like most other major American innovations of the past 75 years, grew largely out of a unique decision made after World War II to tightly interweave the roles of government, private industry and academia.

This triple helix was designed by the influential science administrator Vannevar Bush, who had one foot in all three fields. He was dean of engineering at MIT, one of the founders of Raytheon and then the chief government science administrator during World War II, overseeing, among other projects, the construction of the atomic bomb.

In a 1945 report to President Truman with the quintessentially American title “Science, the Endless Frontier,” Bush recommended that the government not build its own large research labs, as it had done for the atomic bomb project, but instead fund research at universities and corporate labs.

“No American had more influence on the growth of science and technology than Vannevar Bush,” MIT President Jerome Wiesner later proclaimed, adding that his “most significant innovation was the plan by which, instead of building large government laboratories, contracts were made with universities and industrial laboratories.”

Much of the government’s postwar scientific funding went to curiosity-driven basic research that had no known practical application, such as how quantum mechanics can explain what happens on the surface of semiconductor materials, or how fragments of RNA act as messengers to build proteins. Bush knew that discoveries in basic science would be the seed corn that eventually grew into unforeseen inventions, like transistors or mRNA vaccines.

This government-academic-corporate partnership produced the major innovations that propelled the U.S. economy in the postwar era, including microchips, computers, graphical user interfaces, GPS, lasers, the internet and search engines. Google, for example, was started by Larry Page and Sergey Brin as an academic project at Stanford partially funded by the National Science Foundation.

Over the years, an imperfect but productive system has been cobbled together for dividing revenue and intellectual property. In 1980, for example, Congress passed the Bayh-Dole Act, which made it easier for universities to profit from patents even when the underlying research was funded by the government.

One of the most important innovations of our era is the gene-editing technology known as CRISPR. One of its inventors is Jennifer Doudna, a professor at Berkeley, who won this year’s Nobel Prize and is locked in a long-running patent battle with Feng Zhang of the Broad Institute of MIT and Harvard.

They and their institutions are good examples of the government-academic-corporate relationship. Their academic research was funded in part by grants from the National Institutes of Health and the Defense Advanced Research Projects Agency, and both have started private companies to commercialize their CRISPR findings for gene editing, disease diagnosis and, now, coronavirus detection.

This process also produced the Covid vaccines. Over the years, NIH and DARPA have funded university research into how DNA and RNA work. In 2005, for example, a pair of researchers at the University of Pennsylvania, Katalin Kariko and Drew Weissman, showed how to tweak a messenger RNA molecule so that it could get into human cells without being attacked by the body’s immune system.

Two entrepreneurial start-ups

Shortly thereafter, two start-ups were founded to commercialize medical uses for this mRNA: BioNTech in Germany and Moderna in Cambridge, Massachusetts. When the Covid pandemic hit, both devised ways to use mRNA to instruct human cells to make parts of the coronavirus’s spike protein, prompting immunity to the virus. They were aided by guaranteed purchase agreements and logistical support from the government’s Operation Warp Speed.

The government-academic-corporate helix that Bush envisioned has given rise to hotbeds of innovation around major research universities. Silicon Valley began to grow up around Stanford in the 1950s, when its engineering dean, Frederick Terman, began encouraging professors and graduate students to commercialize their findings, which led to the birth of companies such as Hewlett-Packard, Cisco, Sun and Google.

Kendall Square in Cambridge has become the new Silicon Valley. Adjacent to MIT and near Harvard, it is home to the research centers of more than 120 biotechnology companies within a mile of one another, including Moderna, Pfizer, Merck, Novartis and Sanofi.

And increasingly, this model of major universities encouraging the commercialization of their government-supported research is giving rise to other thriving innovation hubs across the country, from Austin and Houston to Raleigh-Durham, Seattle, Nashville and New Orleans.

Walter Isaacson is the author of “The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race,” to be published by Simon & Schuster on March 9.
