

The rise of edge AI is largely driven by growing demand for inference capacity close to end users, as organizations increasingly look to deploy AI models on on-premises edge devices.
Currently, the AI chip landscape is largely dominated by a handful of vendors developing accelerators to support growing workloads in data centers. As the edge grows, however, there are opportunities for new companies to branch out from data center architecture.
SiMa (pronounced See-ma) is one such company currently making waves in the edge AI market, hoping to carve out a niche between lower-powered devices such as phone chips and far more powerful data center equipment. Founded in 2018 by former Groq COO Krishna Rangasayee, the San Jose, California-based company targets the 5W to 25W power consumption segment. It has so far developed a machine learning system on a chip (MLSoC) that it claims allows customers to run entire applications on a single chip. To date, the company has raised $270 million in funding and launched the second generation of its offering in September 2024, with samples set to be available to customers in the fourth quarter of this year.
While not all of SiMa’s customers have been disclosed, it announced a partnership with manufacturing organization TRUMPF earlier this month, which will use its chips to develop AI-powered lasers. When DCD spoke to Gopal Hegde, SiMa’s senior vice president of engineering and operations, in early July 2024, he revealed that the second-generation chip based on TSMC’s 6nm process technology would be arriving at the company within days.
Hegde is a chip industry veteran and serial entrepreneur who joined SiMa four years ago after a stint at Marvell, which had acquired a startup he previously worked for. He tells us that SiMa is specifically targeting the embedded edge market, which the company describes as the layer between the cloud and personal devices.
SiMa believes this segment is currently worth about $40 billion, and within it the company is targeting applications in healthcare, smart retail, autonomous vehicles, government, and robotics.

“AI has really taken off in the data center and the cloud, but it’s taken almost 10 years for that to happen,” Hegde says. The same pattern is playing out in edge AI, he says, where the lack of a single set of requirements has caused the industry to move relatively slowly. Hegde identifies three main challenges: cost, ease of implementation, and lack of expertise.

SiMa differs from other edge AI companies on the market, Hegde says, because in addition to addressing cost and expertise, its Palette software provides a no-code approach to developing machine learning applications. “We’re really focusing on the software infrastructure needed to implement AI and machine learning, which is the main difference between us and a lot of other companies,” he says. “A lot of our competitors make great silicon, and in many cases their silicon may be better than ours. But no one has software like ours, and no one is trying to solve the problem as comprehensively as we are to meet our customers’ needs.”
Those customer needs include the challenges of increasingly complex AI workloads, which Hegde says most chip vendors have responded to by “adding more hardware [to the workload] and hoping the problem goes away.”
Unfortunately, he says, that’s not an option for SiMa’s customers: they can’t deploy, say, Nvidia’s upcoming 1kW Blackwell GPU at the edge, since most devices deployed for edge AI draw only low single- or double-digit wattage. Nvidia does have its own edge offerings, including the 40-60W Nvidia A2 GPU. “We’re not making silicon more complex, but we’re improving the processing and machine learning capabilities,” Hegde says. “With our second-generation [chip], we have double the processing power of the first generation, so it can support much more complex applications, and the way we solve that is with software.”
He adds that for SiMa, the “key innovation” lies in how it develops its software toolchain, which allows networks to run very efficiently without deploying more hardware. This approach, he says, stands in direct contrast to that of major chip companies such as Nvidia, Intel, and AMD, which simply “add more GPU cores to the problem… or more expensive memory,” making the hardware more complex, more power-hungry, and more expensive. By solving the problem in software, SiMa is much more efficient.
“We’re continually working to improve the performance per watt of this chip, and we’re seeing over 50 percent improvement,” Hegde says. “Compared to our previous generation on the emulation platform, we’ve improved performance by over 30 percent over the product’s two-year life just by making software changes.”
As a result, the company expects performance per watt to roughly double over the next 12 months.

Competing with Nvidia
It’s nearly impossible to talk about AI chips without mentioning Nvidia. While Hegde says it’s hard not to see the GPU giant as a competitor simply because of its sheer dominance, the two companies ultimately appeal to very different customer bases: even Nvidia’s lower-power solutions consume far more power than SiMa’s offering.
And while Nvidia has set MLPerf benchmarking records for cloud workloads, Hegde says the company’s numbers don’t stack up when it comes to edge performance. In August 2023, SiMa made its first MLPerf appearance in the v3.1 round, competing with Nvidia’s Jetson Xavier NX Kit (10-40W) in the closed edge ResNet50 benchmark, where it demonstrated better latency, power efficiency, and overall performance. “When [Nvidia] runs edge workloads, they don’t actually run very well because they’re not optimized for the edge,” Hegde says. “So we went to MLPerf to basically compete with them, and in all three submissions (SingleStream, MultiStream, and Offline), we beat them every time.”
Hegde says Nvidia is no longer participating in the closed edge category, instead focusing its efforts elsewhere, where the company continues to set records. But while SiMa has beaten Nvidia on its own turf, Hegde notes that, unlike other edge AI chip startups that have begun eyeing a future move into the data center, this is not a path SiMa is considering.
“Our ambition is to be a key player in the embedded edge market, and we want to get there by really addressing the three main issues we’ve been talking about: cost, ease of use, and deployment and acceleration of a full end-to-end application,” he says. “In all three cases, what we’re doing is very different from what Nvidia is doing, and very different from what all of our competitors are doing.”