Newsletter / Issue No. 31

Image by Ian Lyman/Midjourney.

April 16, 2025

Dear Aventine Readers, 

Nuclear fusion is a technology that — if developed successfully and at scale — could solve a lot of humanity's problems by creating almost limitless clean energy. For decades, it was a pipe dream. A recent breakthrough changed that, and now companies are scrambling to be at the forefront of this transformative technology. The challenge? The cost and risk involved are right up there with space exploration and geoengineering. In this issue we look at how companies are managing that dicey combination as they pursue this energy holy grail.

Also in this issue: 

  • A new hybrid system that combines clean energy with gas-fired peaker plants is being considered to power data centers. 
  • Anthropic reveals just how little understanding we have of how large language models work.
  • And large mammals like bison and wildebeests could help extract CO₂ from the atmosphere. 

    Thanks for reading!

    Danielle Mattoon 
    Executive Director, Aventine

    The promise of nuclear fusion is simple but profound: If properly harnessed, it could provide the world with practically limitless clean energy. Yet the process of building and commercializing the technology is long, difficult, uncertain and expensive.

    For decades, nuclear fusion was little more than a pipe dream. In fact, there’s a long-standing industry in-joke that “fusion is always 30 years away.” But the mood has changed in recent years, due in large part to a turning point in December 2022, when the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory in Livermore, California, announced that it had achieved fusion ignition for the first time — meaning that it had started a fusion reaction that produced more energy than was used to initiate it.

    This was a major achievement, showing for the first time that it might be possible to build a viable fusion power plant. The accomplishment buoyed the ambitions of a growing number of companies that are now jostling to commercialize the underlying science and build the first nuclear fusion power plants. Yet as big-swing technical ambitions go, nuclear fusion is in a class almost by itself. Only a handful of other technological endeavors — space exploration and geoengineering come to mind — require such enormous initial investments with such long-term uncertainty. It’s a dynamic that raises profound questions for companies trying to fund and develop the technology. How do you raise enough money to do this sort of work? If you can’t raise all that money in one go, how can you approach the challenge and still have a shot at solving the problem? If things don’t go as planned, are there ways to ensure that your work still provides some utility to the wider industry and helps your investors recoup their losses? Aventine spoke with experts in academia and industry to understand how insiders are thinking about these problems and the path toward commercializing nuclear fusion.

    The high cost of R&D

    At the heart of nuclear fusion is a simple reaction — the same one that powers the Sun. To replicate on Earth what happens naturally in the Sun requires taking positively charged atomic nuclei — typically isotopes of hydrogen — which naturally repel each other, and forcing them together at temperatures of over 100 million degrees Celsius. At these temperatures, atoms are broken apart into nuclei and electrons — the positive and negative parts of an atom — forming a plasma that must be tightly confined in order to sustain the reaction long enough to create meaningful quantities of energy.
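
    To make the reaction concrete, here is the standard textbook example, the deuterium-tritium reaction that most fusion efforts target; the specific fuel mix is an illustrative addition, since the passage above refers only to hydrogen isotopes in general.

```latex
% Deuterium-tritium fusion (illustrative; the most commonly pursued fuel cycle).
% Two hydrogen isotopes fuse into helium-4 plus a neutron, releasing ~17.6 MeV.
\[
  {}^{2}_{1}\mathrm{H} \;+\; {}^{3}_{1}\mathrm{H}
  \;\longrightarrow\;
  {}^{4}_{2}\mathrm{He}\,(3.5\ \mathrm{MeV}) \;+\; \mathrm{n}\,(14.1\ \mathrm{MeV}),
  \qquad E_{\text{released}} \approx 17.6\ \mathrm{MeV}.
\]
```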

    Creating that reaction currently requires an enormous amount of power. In the tests conducted at the NIF, for example, a peak power of about 500 trillion watts is required for a few billionths of a second to produce energy that exceeds the initial input. (To put 500 trillion watts in perspective, it’s about 500 times the total installed power generation capacity of the U.S.) But in theory fusion doesn’t necessarily require a high energy input, said Dennis Whyte, a co-founder of Commonwealth Fusion Systems, based in Devens, Massachusetts, and a faculty member at MIT. Rather, it can be thought of as an energy amplifier. "Think of it like a fire," Whyte wrote in an email to Aventine. "The initial energy is the chemical energy released by the match, which is necessary to get the fire going, but the fire amplifies this energy through burning the wood." If the process can be made more efficient, then — because the planet holds a practically inexhaustible supply of hydrogen isotopes (the wood of the fusion reaction) — harnessing that reaction at scale could yield virtually limitless energy.
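
    For a sense of what ignition means in numbers, here is a minimal back-of-the-envelope sketch using the widely reported figures from the December 2022 NIF shot; the values are approximate and included purely for illustration.

```python
# Fusion as an "energy amplifier": the target gain Q is the fusion energy
# released divided by the energy delivered to the fuel; Q > 1 means ignition.
# Figures are the widely reported (approximate) values for the December 2022
# NIF shot.

laser_energy_mj = 2.05   # laser energy delivered to the target, in megajoules
fusion_yield_mj = 3.15   # fusion energy released, in megajoules

q = fusion_yield_mj / laser_energy_mj
print(f"Target gain Q ~= {q:.2f}")  # roughly 1.5: more energy out than in at the target

# Note: the far larger amount of electricity drawn from the grid to power the
# lasers is not counted in Q, which is why a practical power plant needs much
# higher gain and much better overall efficiency than this demonstration.
```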

    “You could actually make more energy than humanity will ever need for the first time. The geopolitics, the energy inequity, the other [problematic] parts of energy could get solved,” said Brian Berzin, co-founder and CEO of Thea Energy, a fusion company based in Kearny, New Jersey. “That's pretty big.”

    There are currently two dominant approaches to setting up and maintaining a fusion reaction. One is to use powerful magnetic fields to confine the heated plasma in a ring that circulates within a donut-shaped vessel. The other is to aim high-intensity electromagnetic or laser beams at a tiny fuel pellet from all sides, simultaneously heating and compressing it with such force that a confined fusion reaction ignites. (This is the approach used at the NIF.) While each approach has unique challenges, both are profoundly complex. “It's just hard,” said Edward Moses, CEO of the Orinda, California-based fusion startup Longview Fusion Energy Systems and a former director of the NIF.

    The research and development investment required to build these systems is enormous. The International Thermonuclear Experimental Reactor (ITER) — a collaboration among the EU, China, India, Japan, South Korea, Russia and the United States — will be the largest fusion demonstrator ever built when it goes online in the early 2030s, and it is expected to cost at least $20 billion. For context, CERN’s Large Hadron Collider cost $4.75 billion to design and build.

    Companies seeking to bring this technology to market must design entirely new systems that can efficiently inject enough power to kickstart a fusion reaction. They must build complex systems that use exotic magnets or pressure fields to confine the resulting plasma. They must develop new materials that can withstand the violent conditions inside these devices, find the most effective fuel mixtures for efficient reactions, and devise sensible ways to extract energy from the reaction, among other significant technological advances required to make fusion a reality. In many cases, companies must also build demonstrator systems that could cost tens or hundreds of millions of dollars in order to prove whether their specific approach to fusion will work.

    “If you succeed, the prize on the other side is just amazing. I mean, you just get energy forever,” said Carrie Von Muench, COO at Pacific Fusion, a fusion company based in Fremont, California. “But it takes a lot of capital to get there.”

    The funding challenge — how much is enough?

    Much of the fundamental groundwork for fusion has emerged from public programs inside national or university laboratories, said Whyte. But the U.S. in particular, he said, is “a place that is highly driven by a competitive, commercial landscape” where there is a sense that commercialization is best done through the private sector. That stands in contrast to, say, China, he said, where fusion projects are heavily funded by the government.

    According to the fusion industry research firm FusionX, more than $12 billion of funding has now been poured into private sector nuclear fusion efforts globally. The way companies are making use of that capital varies dramatically, as different players tackle very different technological challenges and fight to secure their own slice of funding.

    There are a handful of highly capitalized startups that have raised significant amounts in funding rounds to support their R&D. Commonwealth is perhaps the poster child of this approach, having raised more than $2 billion, making it the best capitalized company working in nuclear fusion. Helion Energy, based in Everett, Washington, has raised over $1 billion, and Pacific Fusion burst onto the scene last fall with the announcement of a $900 million funding round in which the capital will be unlocked over time as the company achieves a set of predetermined milestones.

    Raising such huge sums is difficult: Few investors are willing to put up that much capital for projects with long lead times and high chances of failure. But for those who secure such funding, the approach provides “maximum control” over timelines and resources, said Whyte. It also provides freedom. Von Muench of Pacific Fusion pointed out that the $900 million in early funding enabled it to hire a high-quality team quickly, helping create “a positive flywheel of executing really quickly early in a company's lifetime.”

    These sorts of investments are the exception rather than the rule, though, and most fusion startups take a more traditional approach to raising funds, working with tens of millions of dollars at a time to make progress and demonstrate that part of their technology works before returning to the markets to raise money for the next wave of progress. Berzin, the CEO of Thea Energy, helped the company raise a $20 million Series A round, which he said funded it through a set of milestones that allowed it to show, by building prototype hardware, that the scientific concepts underlying its new types of magnets could be realized. The company now plans to raise more money to scale up manufacturing of this hardware.

    Whyte has proposed an alternative approach to funding nuclear fusion, borrowing from ideas tested in the field of biotechnology. He suggests creating a nuclear fusion megafund, which would bring together capital and debt from a range of public and private investors, allowing them to invest across a diversified portfolio of fusion companies with the stipulation that some investors, such as governments, would lose their investments first if the fund underperforms. This would help de-risk the sector for private investors, encourage more of them to put up money, and ultimately generate a much larger pool of funds for fusion companies. For now, though, the idea remains a proposal — nobody has yet built such a fund.
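
    As a purely hypothetical sketch of how the first-loss feature of such a megafund could work, the toy function below splits a fund's ending value between a first-loss tranche (for example, government money) and private capital; the structure, names and numbers are illustrative assumptions, not details of Whyte's proposal.

```python
# Toy sketch of a first-loss structure: if the diversified fusion portfolio
# underperforms, the first-loss tranche (e.g. government capital) absorbs
# losses before private investors do. Purely illustrative; not a real fund.

def distribute(portfolio_value: float, first_loss: float, private: float):
    """Split the fund's ending value between the two investor tranches."""
    invested = first_loss + private
    loss = max(0.0, invested - portfolio_value)

    first_loss_hit = min(loss, first_loss)   # first-loss capital is wiped out first
    private_hit = loss - first_loss_hit      # private capital only loses after that

    gain = max(0.0, portfolio_value - invested)
    # Upside shared pro rata across tranches (one simple choice among many).
    return (first_loss - first_loss_hit + gain * first_loss / invested,
            private - private_hit + gain * private / invested)

# A $5B fund ($2B first-loss, $3B private) that ends up worth only $4B:
print(distribute(4e9, first_loss=2e9, private=3e9))  # (1e9, 3e9): private investors made whole
```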

    New strategies and side hustles

    The current, more iterative approach to fundraising means that startups are employing very different strategies to make their work possible. 

    Longview Fusion, for instance, is hoping that by working closely with federal labs like the NIF it can reduce some of its R&D costs. Gauss Fusion, based in Europe with offices in Germany, France and soon Italy, is using a partnership model to bring together technology companies from across Europe to distribute its workload and keep head count low. Avalanche Energy, based in Tukwila, Washington, is looking to build fusion power systems no larger than a shipping container, betting that smaller devices will be cheaper and easier to develop, test and build. 

    Some companies, such as SHINE Technologies, in Janesville, Wisconsin, and TAE Technologies in Foothill Ranch, California, are commercializing some of the technology they are developing for nuclear fusion to generate revenue sooner. In this case, both companies are using technologies similar to those used in nuclear fusion to produce medical isotopes that can be used in diagnosis and treatment. TAE has also repurposed special power systems, initially designed to store and release the power needed for fusion reactions, as a new form of electric vehicle charging system.  Michl Binderbauer, CEO of TAE Technologies, explained that while spinning out elements of a technology in this way provides an extra revenue stream, it could also help streamline processes in the future by helping develop a supply chain.

    Whyte said he’s “thrilled” to see this sort of activity. As technologies become commercialized, he said, they become available for the broader fusion ecosystem to make use of. That enables other companies to skip stages of R&D altogether and make faster progress.

    Know when to pivot

    For others, the capital demands of the ecosystem mean it might not even be worth building a fusion system of their own. First Light Fusion, based in Oxfordshire, U.K., for instance, shifted strategy earlier this year. Rather than building an entire fusion system, which had been its plan since its founding in 2011, it has pivoted to supplying specially designed fuel pellets to other fusion companies, which it says will enable them to start fusion reactions far more efficiently.

    “We could see the money-raising that was being done around some of those larger organizations — Pacific Fusion and Commonwealth being the two examples — and we weren’t going to be able to compete with that,” said Mark Thomas, CEO of First Light. Opting not to sink capital into building a full nuclear fusion system and instead focusing on a specific element of the supply chain, he added, was “obviously very helpful in terms of earlier revenue generation, enabling us to get to break even faster, having to raise less money.”

    This isn’t failure but “a natural part of trying to develop a new industry,” said Whyte, who thinks that the industry will see this sort of thing happening more frequently in the future. Thomas agrees: “I think we'll see [this] becoming a trend, that people will look for more commercial outlets for their capabilities.”

    In fact, companies like First Light may find that they outperform some of their peers by dodging many of the high costs associated with building full-scale nuclear fusion demonstrators. Whyte cited the example of Nvidia: Investors may well stand to make far more money from owning shares in the chip maker powering the rise of AI than from investing in any one AI startup, which could easily fail. “While we tend to focus now on the companies that are trying to develop integrated fusion concepts, it’s likely that some of the greatest economic returns and investment opportunities are not in those companies,” said Whyte.

    Listen To Our Podcast

    Learn about the past, present and future of artificial intelligence on our latest podcast, Humans vs Machines with Gary Marcus.

    Quantum Leaps

    Advances That Matter

    Wildebeest in Kenya. Photo by Michael Kyule/Unsplash.

    Rewilding could help remove carbon from the atmosphere. Big mammals like bison and wildebeests don’t spring to mind as natural tools for tackling climate change, but research suggests that strategically reintroducing certain species into areas where they have been wiped out could help draw hundreds of millions of tonnes of CO₂ out of the atmosphere. New Scientist reports on an emerging field of study focused on how large animals shape entire ecosystems, such that even carnivorous predators can have a large impact on the way a particular habitat absorbs carbon dioxide. As an example, Africa’s Serengeti is home to 1.2 million blue wildebeests. Their trampling of grasses reduces wildfire risk, while their dung stores CO₂, which is further locked out of circulation when insects bury it in the soil. As a result, the area stores 4.4 million tonnes more CO₂ every year than it would without those wildebeests. An experiment in the Tarcu Mountains of Romania reintroduced around 100 European bison to the area in 2014; their number has since risen to 170, and the area they inhabit is now estimated to soak up 10 times more CO₂ than before. Modeling suggests that strategic reintroduction of these and seven other species — including forest elephants, American bison and several kinds of whales — could have a similar impact in specific ecosystems around the globe, potentially helping soak up more than 600 million tonnes of carbon a year. To be clear, this does not suggest that all rewilding is a net benefit for the climate, and the population increases required are substantial: The carbon savings above, for instance, would require a quintupling of African forest elephant populations and a more than 60-fold increase in the number of American bison. Critics also argue that an approach that works in one ecosystem may not translate directly to others, so more work is required to understand the true impact. But in a world where rewilding as a concept is gaining traction, it appears that targeted nurturing of specific animal populations could also help us fight climate change.

    An emerging solution to powering data centers. As the energy needs of artificial intelligence continue to soar and clean energy remains in demand, a new hybrid energy solution is being contemplated for data centers: facilities that combine newly built renewable systems and batteries with existing natural gas-fired “peaker” plants. In this setup, reported by Canary Media and described in a paper by the energy think tank RMI, the data center would rely largely on renewable energy and draw power from the peaker plant only when solar, wind and batteries couldn’t deliver enough electricity. The renewables would piggyback on the existing grid connections used by the peaker plant, meaning that during times of oversupply they could also contribute electricity to the grid. The hitch is that power generated by the proposed approach, according to RMI, would cost as much as $100 to $200 per megawatt-hour, far more than the $30 to $40 per megawatt-hour typical of wholesale clean energy. That said, Microsoft is rumored to have agreed to pay in excess of $100 per megawatt-hour to buy power from the soon-to-be-reopened Three Mile Island nuclear power plant, suggesting that in times of desperation, tech companies may be willing to pay top dollar for clean power if it enables them to keep growing their data center footprint.
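
    To make the arrangement concrete, here is a minimal, hypothetical hourly dispatch sketch of the hybrid setup described above: renewables serve the data center first, then the battery, and the gas peaker fills whatever remains. The function and the numbers are illustrative assumptions, not details from the RMI paper.

```python
# Hypothetical hourly dispatch for the hybrid data center setup described above:
# serve the load from renewables first, then from the battery, and fall back to
# the existing gas peaker only when both are exhausted. Surplus renewable output
# recharges the battery. All names and values are illustrative, not from RMI.

def dispatch_hour(load_mw: float, renewable_mw: float,
                  battery_mwh: float, battery_cap_mwh: float):
    """Return (peaker_mw, new_battery_mwh) for a single hour of operation."""
    from_renewables = min(load_mw, renewable_mw)
    remaining = load_mw - from_renewables

    from_battery = min(remaining, battery_mwh)    # discharge the battery next
    battery_mwh -= from_battery
    remaining -= from_battery

    surplus = renewable_mw - from_renewables      # leftover renewables recharge the battery
    battery_mwh = min(battery_cap_mwh, battery_mwh + surplus)

    return remaining, battery_mwh                 # whatever is left comes from the peaker

# Example: a 100 MW data center in a cloudy, windless hour with a half-charged battery.
peaker_mw, battery_mwh = dispatch_hour(load_mw=100, renewable_mw=20,
                                       battery_mwh=50, battery_cap_mwh=400)
print(peaker_mw, battery_mwh)   # 30 MW from the peaker; battery drained to 0
```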

    Anthropic shines a light on the inner workings of LLMs. Anthropic, the $60 billion artificial intelligence startup, has been studying how large language models perform tasks such as addition, writing poetry and reasoning, along with seven other activities and behaviors. Its approach identifies clusters of activity inside an LLM that light up when the model is prompted to tackle a particular concept. For example, when the model deals with words like “small” or “tiny,” or even small objects such as thimbles, a region associated with “smallness” will activate, Anthropic told MIT Technology Review. The results reveal some strange inner workings. Take, for instance, the addition of 36 + 59. The model did two things when asked that question: It added together rough estimates of the two numbers and landed on something around 92; at the same time, it added the numbers’ final digits (6 and 9) to determine that the answer must end in a 5. Combining the two approaches, it gave the (correct) answer of 95. Yet when asked directly how it reached this solution, the LLM said that it had deconstructed the problem into ones and tens and performed basic addition, including carrying from the ones column into the tens column. This, as it happens, is the way addition is explained in the reams of data the LLM was trained on, and the fact that the model performed the calculation one way and explained it another raises obvious concerns about its trustworthiness. In another example, the model was prompted to write the second line of a rhyming couplet based on the first, and was seen to identify candidate words that made a rhyme possible before deciding how to write the line. This behavior runs counter to the common understanding that LLMs create language simply by predicting the most likely next word, suggesting that the mechanisms through which these models generate text may be more complex than we realize. And that is perhaps the biggest point to make here: Even though Anthropic was able to tease out glimpses of how a model works in a way that hasn’t been possible before, we are still a very long way from understanding why these models behave the way they do.
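
    As a rough illustration of how those two paths could combine, here is a toy sketch that mimics the reported behavior: a fuzzy estimate of the sum plus an exact last-digit check. It is an analogy for intuition only, not Anthropic's method or the model's actual internals.

```python
# Toy illustration of the two parallel paths described above: a fuzzy estimate
# of the sum combined with an exact last-digit check. For intuition only; this
# is not how the model is actually implemented.

def combine(rough_estimate: int, last_digit: int) -> int:
    """Pick the number near the rough estimate whose final digit matches."""
    candidates = [n for n in range(rough_estimate - 10, rough_estimate + 11)
                  if n % 10 == last_digit]
    return min(candidates, key=lambda n: abs(n - rough_estimate))

rough = 92              # the "somewhere around 92" path
last = (6 + 9) % 10     # the exact path: 6 + 9 = 15, so the answer ends in 5
print(combine(rough, last))   # -> 95
```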

    Long Reads

    Magazine and Journal Articles Worthy of Your Time

    Inside Google’s Two-Year Frenzy to Catch Up With OpenAI, from Wired
    5,500 words, or about 22 minutes

    In November 2022, OpenAI caught almost everyone off guard by releasing ChatGPT. Surprisingly, that included Google, until then perhaps the dominant force in AI development and, as it happens, the birthplace of the transformer, the very technology that underpins GPT-style models. This story, based on interviews with more than 50 current and former Google employees, describes how the company responded to what it saw as an existential threat. A 100-day sprint to release its own ChatGPT competitor — a challenge laid down by senior management — helped get the company back into the mix, and its Gemini models now jostle with products from the likes of OpenAI and Anthropic among the most advanced LLMs in production. But in many ways, that was just the beginning of the story. OpenAI still dominates the sector, and AI now seems to dominate pretty much everything inside Google, drawing resources from other teams and finding its way into almost all of the company's other products. The pressure to keep up with OpenAI has not eased, despite the company’s staff cuts in recent years, and the story suggests that burnout and a sense of unease pervade much of Google’s AI workforce.

    Healing My Family’s Future, from Asimov Press
    4,100 words, or about 16 minutes

    In this fascinating piece weaving together memoir and cutting-edge scientific research, Jennifer Lai Remmel, a biomolecular engineer, recounts her experience of being diagnosed with an aggressive hereditary cancer and the steps she took to save her own life and protect her child from the same fate. Lai Remmel lost her mother at an early age to gastric cancer, but only later, as she was embarking on her career as a research scientist, did she understand that a predisposition to the cancer may have been passed on to her. After confirming through genetic testing that she shared her mother’s vulnerability to gastric cancer, she had her stomach prophylactically removed. Though she had suffered no symptoms, her stomach was found to contain 115 sites of carcinoma. Then, in order to avoid giving birth to a child with the same predisposition to stomach cancer, Lai Remmel and her partner embarked on a course of IVF that allowed them to choose an embryo without the genetic mutation she carried. As she writes, that process raised all sorts of ethical questions: At what point does choosing an embryo start to look like selecting a designer baby? Do such interventions, even when they are well-intentioned, set us on a slippery slope in that direction? Which conditions are serious enough to warrant this approach? There are no easy answers, but as genetic testing becomes more common and interventions more accessible, these could well become societal rather than purely personal decisions.

    The biggest machine in science: inside the fight to build the next giant particle collider, from Nature
    3,300 words, or about 13 minutes

    The Large Hadron Collider, developed and run by the European laboratory CERN, is the most powerful particle accelerator ever built, and has facilitated fundamental science that confirmed the existence of the Higgs boson, perhaps the most important particle physics discovery in decades. But in particle physics, bigger is often better, so CERN has for years been working on plans for its successor, the Future Circular Collider. The circular accelerator would have a circumference of over 56 miles, dwarfing the 17-mile Large Hadron Collider, and would smash particles together with eight times the energy of its smaller cousin. It could theoretically produce particles that humans have never before observed, helping us better understand the fundamental underpinnings of the universe and investigate topics such as dark matter. Yet the project is mired in controversy: Many scientists don’t think the new facility would be worth its potential $30 billion price tag, arguing that it would take too long to build and would divert attention from other important areas of research. At the same time, some of CERN’s member states are threatening to withhold funding for the project, and a similar machine proposed in China could undercut the work entirely. This story sheds light on tensions that could alter the future course of physics.

    contact

    380 Lafayette St.
    New York, NY 10003
    info@aventine.org

    © Aventine 2021