Tel Aviv, Israel-based Deci, a company developing a platform to optimize machine learning models, today announced that it raised $21 million in a series A round led by Insight Partners with participation from Square Peg, Emerge, Jibe Ventures, Samsung Next, Vintage Investment Partners, and Fort Ross Ventures. The investment, which comes a year after Deci’s $9.1 million seed round, brings the company’s total capital raised to $30.1 million and will be used to support growth by expanding sales, marketing, and service operations, according to CEO Yonatan Geifman.
Advancements in AI have led to innovations with the potential to transform enterprises across industries. But long development cycles and high compute costs remain roadblocks in the path to productization. According to a recent McKinsey survey, only 44% of respondents reported cost savings from AI adoption in business units where it’s deployed. Gartner predicts that — if the current trend holds — 80% of AI projects will remain “alchemy,” run by “[data science] wizards” whose talents “will not scale in the organization.”
Deci was cofounded in 2019 by Geifman, Ran El-Yaniv, and entrepreneur Jonathan Elial. Geifman and El-Yaniv met at Technion’s computer science department, where Geifman was a PhD candidate and El-Yaniv a professor. By leveraging data science techniques, the team developed products to accelerate AI on hardware by redesigning models to maximize throughput while minimizing latency.
“I founded Deci in 2019 with Professor Ran El-Yaniv and Jonathan Elial to address the challenges stated above. With our talented team of deep learning researchers and engineers, we developed an innovative solution — using AI itself to craft the next generation of AI. By utilizing an algorithmic-first approach, we focus on improving the efficacy of AI algorithms, thus delivering models that outperform the advantages of any other hardware or software optimization technology,” Geifman told VentureBeat via email.
Deci achieves runtime acceleration on cloud, edge, and mobile through data preprocessing and loading, automatically selecting model architectures and hyperparameters (i.e., the settings that govern a model's structure and training). The platform also handles steps like deployment, serving, and monitoring, continuously tracking models and recommending migrations to more cost-effective services.
“Deci’s platform offers a substantial performance boost to existing deep learning models while preserving their accuracy,” the company writes on its website. “It designs deep models to more effectively use the hardware platform they run on, be it CPU, GPU, FPGA, or special-purpose ASIC accelerators. The … accelerator is a data-dependent algorithmic solution that works in synergy with other known compression techniques, such as pruning and quantization. In fact, the accelerator acts as a multiplier for complementary acceleration solutions, such as AI compilers and specialized hardware.”
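The quote above names pruning and quantization as the compression techniques Deci's accelerator works alongside. As a generic illustration of what those two techniques do (not Deci's implementation; the weight values and threshold here are made up), a toy weight vector can be pruned by magnitude and then quantized to 8-bit integers:

```python
# Generic sketch of two standard compression techniques, not Deci's code.
# All values and the 0.05 threshold are illustrative assumptions.

weights = [0.82, -0.03, 0.45, 0.01, -0.67, 0.002]

# Magnitude pruning: zero out weights whose absolute value is below a threshold,
# shrinking the model with (ideally) little accuracy loss.
pruned = [w if abs(w) >= 0.05 else 0.0 for w in weights]

# Symmetric int8 quantization: map floats onto integers in [-127, 127]
# using a single scale factor, so weights can be stored in 8 bits.
scale = max(abs(w) for w in pruned) / 127
quantized = [round(w / scale) for w in pruned]

def dequantize(q):
    """Approximately recover the float weights from their int8 codes."""
    return [v * scale for v in q]

print(quantized)  # small weights are gone; the rest are 8-bit integers
```

Techniques like these reduce model size and memory traffic, which is why an architecture-level accelerator can act as a multiplier on top of them rather than a replacement for them.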
Machine learning deployments have historically been constrained by the size and speed of algorithms, as well as the need for costly hardware. In fact, a report from MIT found that machine learning might be approaching computational limits. A separate Synced study estimated that the University of Washington’s Grover fake news detection model cost $25,000 to train in about two weeks, and Google spent an estimated $6,912 training BERT.
Above: Deci’s backend dashboard.
Deci’s solution is an engine — Automated Neural Architecture Construction, or AutoNAC — that redesigns existing models into new ones with several computation routes, optimized for a target inference device and dataset. Each route specializes in a prediction task, and Deci’s router component directs each data input to the appropriate route.
“[O]ur AutoNAC technology, the first commercially viable Neural Architecture Search (NAS), recently discovered DeciNets, a family of industry-leading computer vision models that have set a new efficient frontier utilizing only a fraction of the compute power used by the Google-scale NAS technologies, the latter having been used to uncover well-known and powerful neural architectures like EfficientNet,” Geifman said. “Such models empower developers with what’s required to transform their ideas into revolutionary products.”
The thirty-employee startup recently announced a strategic collaboration with Intel to optimize AI inference on the chipmaker’s CPUs. Beyond Intel, Deci says that “many” companies in the autonomous vehicle, manufacturing, communications, video and image editing, and health care sectors have adopted its platform.
“Deci was founded to help enterprises maximize the potential of their AI-based solutions. Enterprises that are leveraging AI face an upward struggle, as research demonstrates that only 53% of AI projects make it from prototype to production,” Geifman said. “This issue can largely be attributed to difficulties navigating the cumbersome deep learning lifecycle given that new features and use cases are stymied by limited hardware availability, slow and ineffective models, wasted time during development cycles, and financial barriers. Simply put, AI developers need better tools that examine and address the algorithms themselves; otherwise, they will keep getting stuck.”
Deci has competition in OctoML, a startup that similarly purports to automate machine learning optimization with proprietary tools and processes. Other competitors include DeepCube, Neural Magic, and DarwinAI, which uses what it calls “generative synthesis” to ingest models and spit out highly optimized versions.