Dancing with AI

How SambaNova is moving with agility to take full advantage of a massive opportunity

Even inside the AI industry, no one could have anticipated the megatsunami triggered by the release of ChatGPT on November 30, 2022. Sure, academics, entrepreneurs, technologists, investors, and others working in the sector had predicted for years that artificial intelligence would become a big deal. But suddenly the entire world was talking about it—and enterprises everywhere were scrambling to get on board.
For the executive team at AI company SambaNova Systems, this radical shift presented both opportunities and challenges. SambaNova was founded in 2017 on the belief that computer systems needed to be reimagined to more efficiently run the vast new neural networks being developed to push forward the frontiers of AI. The company and its investors had staked everything on the premise that we were in the midst of a generational transition in AI that would soon touch every industry. Its mission is to provide enterprises, regardless of sector or geography, with the capabilities to implement and control their own AI infrastructure and models. The executive team recognized the release of ChatGPT as a potential galvanizing event that could greatly accelerate SambaNova’s growth trajectory—so long as the company continued to execute well in a changed environment.
“We tell companies, ‘You want GPT? We’ll give you your own that’s secure, private, and trained on your own data.’ ”
“The number of companies wanting to engage with us jumped four to five times,” Rodrigo Liang, SambaNova co-founder and CEO, says of post-ChatGPT life. That has put a great strain not only on the company’s salespeople but also on its entire operating model, impacting the product design teams that tailor systems and the engineers who pretrain the models they create for each customer. At the same time, it has intensified the competition SambaNova faces from companies ranging from well-financed startups to established AI hardware and chip vendors like Nvidia.

So how does a company in that enviable situation—product-market fit on a scale no one could have anticipated—make the most of the opportunity? “As a leader, you want to make sure that everyone around you is committed to the speed of action,” says Michael Useem, a professor emeritus of management and director of the Center for Leadership and Change Management at the University of Pennsylvania’s Wharton School. But speed cannot come at the expense of making the right decisions or of prioritizing customers that are aligned with the company’s strategy and objectives. “Make sure that the collective wisdom of your leadership team is part of those decisions,” Useem adds.

The SambaNova team appears to have embraced those lessons. “You need discipline,” says Marshall Choy, the company’s senior vice president of Product, “or your company quickly turns into a caricature of a Swiss army knife, where you’ve got 50 things but you’re not optimized or really good for any particular thing.” Discipline also meant adjusting the company’s key messages to prospective customers, learning to say no, leaning into the prevailing narrative, and focusing not on where AI is today but on where it’s likely to go over the next few years.

AI chip shot

What’s most striking about Palo Alto–based SambaNova is its ambition. Liang’s co-founders, Kunle Olukotun and Christopher Ré, are Stanford professors who teamed up in the mid-2010s to rethink computer architecture for an AI world. Ré was focused on machine learning and databases. Olukotun, who is best known for pioneering the multicore processor, a technology that led to major improvements in chip efficiency and speed and became ubiquitous in the early 2000s, focused on chips and systems.

Rather than tweaking existing technologies to run ever-expanding neural nets, Olukotun and Ré decided to start from scratch, imagining a new type of system built expressly for AI that would avoid the bottlenecks and limitations of legacy chips and servers. Their solution was a new, full-stack computing architecture—from silicon to software—tailor-made for data-heavy AI algorithms. “Traditional chips, even those commonly used for AI applications, are not good at handling those data-heavy algorithms,” says Olukotun. “So we set out to design a computer that natively executes this paradigm of computing, which is core to machine learning models.”
Olukotun began talking with Liang in early 2017 about what he sometimes dubs “data flow computing” and sometimes “software 2.0.” The two knew one another from Afara Websystems, a chip startup Olukotun had founded in 2000 to capitalize on his idea for a multibrained chip built to keep pace with web traffic and the demands of heavily used servers. Olukotun had hired Liang to oversee not just the production of the novel chips but also the development and build-out of a complete system. “Kunle and I see the world in a very similar way,” says Liang, who has worked on high-end computers since his first job at Hewlett-Packard in 1993. “You can’t just build these individual components without having the stack.” At Afara, the team’s ambitions collided with the dot-com crash of the early 2000s, when money for startups dried up. In 2002, Sun Microsystems bought the company for $28 million.

When Olukotun laid out his idea for a new, AI-native system designed expressly for data-intensive computing, Liang jumped at the opportunity. “Kunle is a brilliant mind when it comes to systems,” Liang says. “He’s usually many, many years ahead of everybody else in what’s coming next.” Liang took on the role of CEO. As the company’s chief technologist and chief scientific advisor, respectively, Olukotun and Ré split their time between teaching at Stanford and the company. Liang, who was born in Taipei but grew up in Brazil, chose the name SambaNova, which means “new dance” in Portuguese. “It’s a new dance,” Liang says. “It’s time for us to do something new in the world of technology.”
“It’s so important to understand where the technology is going, especially in a fast-moving field like this one.”
SambaNova’s pioneering technology has had a proven impact on many high-profile customers. One early adopter, the U.S. Department of Energy’s Argonne National Laboratory, uses SambaNova’s systems for applications that range from improving weather forecasting to better predicting how tumors respond to different drug combinations. The national lab has reported a tripling in performance. Says Vijay Tatkar, a director at SambaNova: “We’ve been able to train large language models six times faster than an Nvidia A100.” Other operations, he adds, have seen a 10x improvement in speed.

This past September, SambaNova introduced a new chip, the SN40L, that wowed markets and the press and vastly improved on the performance of its earlier models. The chip is designed to run AI models more than twice the size of the advanced version of OpenAI’s ChatGPT and is tailor-made for LLMs in the enterprise, enabling companies to stand up their own ChatGPT equivalent in a matter of days. Its release, which coincided with a chip shortage that pushed customers to find new suppliers, further positioned the company as a direct competitor to Nvidia in the market for AI computing.

Life after ChatGPT

Prior to November 30, 2022, when SambaNova’s people knocked on the doors of the large enterprises they had identified as natural customers for its high-end systems, they were met by worries about an economic slowdown and skepticism around AI. Company executives wished for tailwinds that would help boost sales. Instead, they were met by hurricane forces.

“Before November, almost every conversation with a customer started with us trying to convince them of AI’s usefulness,” says Sumti Jairath, a member of the company’s founding team and its chief architect. “Now 100 percent of the conversations start with the concrete ways they might use AI.” Potential customers who have toyed with ChatGPT and played with image-making diffusion models “intuitively understand AI’s potential,” Jairath says. “Now they want to know how it can help their enterprise solve business problems.”

SambaNova had its successes prior to the release of ChatGPT. An impressive roster of government and business customers had bought or leased its DataScale product, a refrigerator-sized server rack loaded with 32 of its chips, all sharing the same power supply, cooling system, and networking infrastructure. In 2021, SoftBank Vision Fund 2 led a $676 million funding round, making SambaNova temporarily the world’s best-funded AI startup, with a valuation of more than $5 billion. “Rodrigo always said if we ever did another hardware company after Afara, he would make sure that we did not run out of money,” Olukotun says. Two months prior to OpenAI’s unveiling of ChatGPT, SambaNova released the third generation of its signature chip. “They’ve developed really innovative hardware and software solutions that dramatically accelerate workloads and make them really attractive in all sorts of very different industries, from autonomous vehicles to financial services to oil and gas exploration,” says Amit Lubovsky, an investment director at SoftBank Investment Advisers and a SambaNova board member.
Timetables were upended after November 30. One- and three-year projects were rendered obsolete. Prior to the launch of ChatGPT, it took months to close a deal. Since then, the excitement and urgency around generative AI have made the process much faster. “The velocity of the business has been supercharged,” Choy says.

SambaNova also became more focused. Previously, the company would stress that its chips were multipurpose. Its machines do seismic analysis for oil and gas companies, process satellite imaging for the U.S. government, and run scientific models on behalf of research institutions. But now all anybody wants to hear about is generative AI, and more specifically LLMs, the technology behind ChatGPT and other conversational AI interfaces. “When November hit and we saw all the fervor around natural language processing, we leaned hard into the language side,” Liang says. “It was like, ‘Let us be your private GPT vendor.’ ” The new SN40L chip bolsters that pitch even further.

The interest in LLMs also caused SambaNova to reorient resources toward inference, the stage at which a trained model generates predictions or responses from new inputs. The company also launched a pretrained multilingual language model, BLOOMChat, for prospective customers to test the waters and fine-tune models with their own data. And it created a program that allows startups to onboard onto the SambaNova platform and use it to develop new initiatives. “It’s a good thing to do for the entrepreneurial community and a smart way for SambaNova to stay on top of what’s going on in the market,” Lubovsky says.
Mastering the AI beat

SambaNova co-founders share insights and tactics:

  • Software 2.0: Coding in the AI Age
  • A New Tech Revolution
  • The Beauty of AI Assistants
  • Betting on Open Source
  • Keeping Up With a Fast-Moving Industry

While the ChatGPT release was a watershed moment for the company, it did not change the trajectory of the system’s design or the company’s conviction about how enterprises should incorporate AI. SambaNova had long bet on the relevance of open-source models, and that conviction only solidified as the company worked to ensure open-source models ran as efficiently as possible on its systems. “Our focus was engaging with customers around the idea of using open models and training those models on their own data to provide the capabilities that they needed within their organizations,” Olukotun says.

There were challenges. Expedited sales are great for the bottom line, but they have also taxed SambaNova’s supply chain. “We have a lot more work [to do] trying to accelerate material to get things out to customers faster,” Liang says. “We need to do a lot more work on software releases. We need more coverage with staffing that supports our customers.”

SambaNova has also had to contend with the backlash sparked by media hype around generative AI. The same outlets that wrote so enthusiastically about the technology have since become focused on its limitations. “Just three or four months after it started, people were talking about the cracks,” Liang says. “It was like, ‘Oh, it’s not secure, it hallucinates, it’s not auditable.’ ” But those negatives became an advertisement, of sorts, for SambaNova’s open-source philosophy. Nearly all the LLMs on the market are based on the transformer architecture introduced in a famous 2017 paper from researchers at Google, and all use the same general design. What most differentiates them is the nature and quality of the data used to train a neural network.

Lessons from SambaNova on how to make the most of a booming AI market

  1. Adjust your message. Prospective customers are no longer asking what AI can do for them. That has made room for SambaNova to talk about other topics, from security and accuracy concerns to how customers can deploy AI to their advantage.

  2. Avoid distractions. The same core principles that drove SambaNova’s success before the release of ChatGPT guide its people today. The environment changed, but not the company, its core offering, or its philosophy.

  3. Stay focused on the right customers. Inbound contacts have increased more than fourfold. SambaNova chooses to work with those customers for whom it can provide genuine, highly differentiated value.

  4. Lean into the prevailing narrative. As the world has become obsessed with LLMs, SambaNova leads with chat and conversational interfaces. The fuller sales pitches come once the company has a customer’s attention.

  5. Don’t stop innovating. Disruptive technologies like AI evolve in waves. Rather than focusing on models from AI’s first wave, SambaNova works to understand how to harness the models that will be needed for the second, third, or fourth waves.

“We tell companies, ‘You want GPT? We’ll give you your own that’s secure, private, and trained on your own data,’ ” Liang says. The company pretrains open-source models on behalf of its customers, tailored to the industry each is in. Auditing, accuracy, and alignment are more straightforward when a customer knows exactly what data was used to train a neural network. Importantly, SambaNova, unlike some of its competitors, encourages customers to retain control of their own data and models, protecting critical IP, an approach that “really resonates with businesses,” Liang says. “We’re in the business of building people’s models as assets. As soon as we’ve trained these models for customers using their data, 100 percent of the customers want to own that model.”

As interest in SambaNova’s technology expands geometrically, the company is working hard to avoid being pulled in directions that could be counterproductive. “In a market like this one that’s brand new and offers a breadth of opportunities, we’ve made sure that a level of discipline permeates the company,” Liang says. One requirement: spending more time with customers to ensure the company is reacting to the market as it exists and not as it imagines it. As a result, customer-facing employees find themselves on the road more often than not. “On the business side of things,” Choy says, “we are just spending inordinate amounts of time outside SambaNova and inside the walls of our customers, discussing, learning, and aligning with them.”

Walking the AI talk

A chart in the presentation SambaNova shares with customers plots the march of artificial intelligence, from machine learning to deep learning to generative AI and, ultimately, pervasive AI—the idea that AI will touch every part of the economy. As incredible as the advances have been, that last step remains a ways off, Liang says. Of the 2,000 largest companies on the planet, he adds, “99 percent are only talking and don’t really have any AI in production.” That’s a clear signal to Liang and Olukotun that, for all the hubbub, the market still lags the technology. But for SambaNova to capitalize on the enormous opportunity, the company knows it needs to stay at the forefront of new developments. “It’s so important to understand where the technology is going, especially in a fast-moving field like this one,” Olukotun says. “Luck favors the prepared.” With the SN40L, which enables much faster and more scalable inference and training without sacrificing model accuracy, SambaNova certainly looks prepared for what’s to come.