San Francisco – March 5, 2025 – Ceramic.ai, a foundation model training infrastructure company, emerged from stealth today with software designed to let businesses build and fine-tune generative AI models more efficiently.
Founded by Anna Patterson, a former Google VP of Engineering and the founder of Gradient Ventures, Ceramic.ai said it improves training speed and cost-effectiveness for AI models, delivering up to a 2.5x performance boost on NVIDIA GPUs.
Ceramic.ai also said it has secured $12 million in seed funding from NEA, IBM, Samsung Next, Earthshot Ventures and Alumni Ventures.
“Despite the surge in AI adoption, too many companies are still hampered by the barriers of scale, from exorbitant costs to limited infrastructure,” said Patterson. “We are democratizing access to high-performance AI infrastructure so businesses can navigate the complexities of AI training without spending hundreds of millions on research and engineering resources. But the transition to enterprise AI is not just about changing the way businesses work. When it comes to AI adoption, we are still singing the national anthem at the baseball game.”
Global AI investment has grown explosively, from $16 billion in 2023 to an estimated $143 billion by 2027. The main challenge is that building AI infrastructure is expensive, complex and resource intensive. While tech giants spend billions developing their own AI infrastructure, most companies don’t have the engineering resources to optimize and scale their own AI models.
Today’s AI infrastructure was not built for 10x growth, let alone 100x; exponential growth requires a complete redesign. Ceramic.ai bridges this gap with a faster, fundamentally scalable, enterprise-ready platform for next-generation AI, dramatically reducing the complexity and cost of AI model training.
The software platform enables models to be trained with long contexts at any cluster size, allowing companies to develop, train and scale their own AI models faster than with traditional methods. For small models, Ceramic.ai is up to 2.5 times faster on NVIDIA H100 GPUs than current state-of-the-art platforms, and for large, longer-context models, Ceramic.ai is the only viable choice for fast training.
Ceramic.ai has developed a comprehensive platform that addresses the central challenges of enterprise AI deployments.
● Speed and Efficiency: Ceramic.ai’s training infrastructure offers up to 2.5x the efficiency of open-source stacks, reducing training costs while improving model performance.
● Exclusive Long-Context Training: Ceramic.ai is the only platform that enables training of large models on long-context data, offering unparalleled quality and performance. The company outperforms all reported benchmarks for long-context model training and maintains high efficiency even at 70B+ parameters.
● Superior Reasoning Model Performance: Ceramic trained problem-solving reasoning models and achieved a 92% exact-match pass@1 score on GSM8K by fine-tuning Meta’s Llama 3.3 70B base model, surpassing DeepSeek R1’s 84%.
● Optimized Data Processing: Ceramic reorganizes training data so that each microbatch is aligned by topic. Current approaches either mask out other documents, losing the benefit of the longer context length, or attend to unrelated documents and learn bad habits. Sorting the training data increases the number of data points attention can train on quadratically, with a 64K or 128K context drawn entirely from the same topic, as sketched below.
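To make the last bullet concrete, here is a minimal sketch of topic-aligned microbatch packing. It assumes documents that are already tagged with a topic label and pre-tokenized; the helper name `pack_topic_aligned` and the input format are hypothetical illustrations, not Ceramic.ai’s actual implementation.

```python
# Illustrative sketch of topic-aligned context packing (not Ceramic.ai's code).
# Assumes each document is a dict with a "topic" label and a "tokens" list.
from collections import defaultdict
from typing import Iterator

CONTEXT_LEN = 65_536  # e.g. a 64K-token training context


def pack_topic_aligned(docs: list[dict]) -> Iterator[list[int]]:
    """Yield token windows up to CONTEXT_LEN, each filled only with
    documents that share a single topic label."""
    by_topic: dict[str, list[dict]] = defaultdict(list)
    for doc in docs:
        by_topic[doc["topic"]].append(doc)

    for topic_docs in by_topic.values():
        window: list[int] = []
        for doc in topic_docs:
            tokens = doc["tokens"]
            if window and len(window) + len(tokens) > CONTEXT_LEN:
                yield window              # emit a full same-topic window
                window = []
            window.extend(tokens[:CONTEXT_LEN])  # truncate oversized docs
        if window:
            yield window


# Usage: each yielded window fills one microbatch slot, so attention only
# ever spans documents from the same topic.
# windows = list(pack_topic_aligned(tagged_and_tokenized_corpus))
```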
Built by a team of infrastructure experts, Ceramic.ai has already demonstrated cost reductions and improved model training efficiency for businesses in early testing. The company has partnered with Lambda, AWS and others for accelerated training.
“Ceramic.ai is a game changer for AI developers and businesses looking for improved efficiency and superior price performance,” said Sam Khosroshahi, BD & Strategic Pursuits, AI & Machine Learning at Lambda. “Together, our products give customers an accelerated full-stack solution, validated and supported by both infrastructure and model expertise, so they can reach results faster, reduce development costs and deliver higher-quality solutions.”
“AI’s meteoric rise has been like strapping a rocket to a horse-drawn carriage,” said Lila Tretikov, partner and head of AI strategy at NEA. “Anna and her team at Ceramic.ai are algorithmically crushing key bottlenecks in model training, making it faster, more efficient and more scalable. Ceramic enables businesses to scale their already massive AI training workloads 100-fold.”
“This investment in Ceramic shows how IBM drives innovation and strengthens its partnerships in a highly strategic area,” said Emily Fontaine, vice president of IBM Global Venture Capital. “We are excited to work with Ceramic to address the critical need to reduce the computational cost of AI and make training more efficient and accessible.”