Set to revolutionize sovereign AI services in the EU with an NVMe over TCP software-defined data platform

Lightbits Labs (Lightbits®), the inventor of NVMe® over TCP and a pioneer in modern, software-defined cloud data platforms, today announced that Nebul has chosen Lightbits as the storage foundation on which to build cutting-edge AI cloud services. Nebul’s service helps companies across the EU unify their data, deploy NVIDIA-based Private AI, and extract actionable insights from their data through a robust and protected cloud infrastructure. Lightbits enables the AI cloud service provider to offer its customers 16x better performance at significantly lower cost than the alternatives while also supporting its sustainability mission. The high performance and low latency of Lightbits, combined with its compatibility with common orchestration environments, including Kubernetes, VMware, OpenShift, and OpenStack, make it an ideal choice for cloud builders and service providers supporting diverse performance-sensitive workloads at scale.
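
For context, an NVMe over TCP volume is attached to a host with standard Linux tooling. The sketch below drives the nvme-cli discover and connect commands from Python; the target address, port, and subsystem NQN are placeholders for illustration, not values published by Lightbits or Nebul.

    import subprocess

    TARGET_ADDR = "192.0.2.10"                   # placeholder storage-server IP (assumption)
    TARGET_PORT = "4420"                         # default NVMe/TCP port
    SUBSYS_NQN = "nqn.2016-01.com.example:vol1"  # placeholder subsystem NQN (assumption)

    # Discover the subsystems the target exports over TCP.
    subprocess.run(
        ["nvme", "discover", "-t", "tcp", "-a", TARGET_ADDR, "-s", TARGET_PORT],
        check=True,
    )

    # Connect to one subsystem; the volume then appears as a local /dev/nvmeXnY
    # block device that Kubernetes, VMware, OpenShift, or OpenStack can consume
    # like any other disk.
    subprocess.run(
        ["nvme", "connect", "-t", "tcp", "-a", TARGET_ADDR, "-s", TARGET_PORT,
         "-n", SUBSYS_NQN],
        check=True,
    )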

Nebul chose Lightbits for its superior performance, resilience, and comprehensive data services, which include three-way replication and QoS. “We considered the usual storage providers, but even the software-defined ones tend to have a ‘closed recipe’ now in terms of infrastructure. We prefer to build with software that’s open, innovative, and especially performant,” said Arnold Juffer, CEO and Founder of Nebul. “With Lightbits as part of our platform, we can offer our customers 16 times the performance at half the cost of AI services from the hyperscalers.” This remarkable efficiency allows Nebul to improve large language model (LLM) performance and handle latency-sensitive workloads under stress without compromising cost or reliability.

As a service provider, Nebul realized its customers would require block storage for the latency-sensitive workloads commonly found in AI, such as RAG model training and inference. These workloads are built on vector, real-time, and other NoSQL databases that demand high performance at scale. The Lightbits cloud data platform scales beyond the petabyte level and delivers up to 75 million IOPS with consistent sub-millisecond tail latency, even under heavy load. This exceptional performance profile makes it an ideal solution for vector and other AI-oriented databases, whether they manage real-time AI application data or store training parameters and tags.
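
To illustrate what tail latency means here, the Python sketch below samples 4 KiB random reads from a block device and reports the 99th-percentile latency rather than the average. The device path is a placeholder, and a dedicated tool such as fio is the usual choice for real benchmarking; this only shows the measurement idea, not Lightbits’ or Nebul’s test methodology.

    import mmap
    import os
    import random
    import time

    PATH = "/dev/nvme1n1"   # placeholder device or file path (assumption); may require root
    BLOCK = 4096            # 4 KiB reads, typical of latency-sensitive database I/O
    SAMPLES = 10_000

    fd = os.open(PATH, os.O_RDONLY | os.O_DIRECT)   # O_DIRECT (Linux) bypasses the page cache
    size = os.lseek(fd, 0, os.SEEK_END)
    buf = mmap.mmap(-1, BLOCK)                      # page-aligned buffer, required by O_DIRECT

    latencies = []
    for _ in range(SAMPLES):
        offset = random.randrange(size // BLOCK) * BLOCK   # block-aligned random offset
        start = time.perf_counter()
        os.preadv(fd, [buf], offset)
        latencies.append(time.perf_counter() - start)
    os.close(fd)

    latencies.sort()
    p99 = latencies[int(len(latencies) * 0.99)]
    print(f"p99 read latency: {p99 * 1e6:.0f} us over {SAMPLES} samples")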

"With Lightbits in place, Nebul can now provide EU-based organizations with a specialty AI cloud that runs on NVIDIA AI Enterprise with certified tools, frameworks, and AI apps to meet the demands of today’s most intensive applications,” commented Eran Kirzner, CEO and co-founder of Lightbits. “The data platform is incredibly flexible running in container environments, like Kubernetes and Azure Kubernetes Solution (AKS), delivering accelerated performance and efficiency for cloud-native applications at scale. We attribute this versatility combined with the unparalleled speed, scalability, and cost-efficiency to the increase in interest and cloud service use cases, like AI.”

The Lightbits cloud data platform enables enterprises to build AI clouds with extreme performance at scale to capitalize on rapidly expanding AI opportunities.

“Nebul evaluated many systems and conducted rigorous testing, seeking a solution that could handle extreme conditions and high loads without performance degradation. Lightbits emerged as the clear leader, meeting Nebul’s stringent requirements and providing a platform for continuous innovation,” added Jos Keulers, Founder of NVMestorage.com, a Lightbits Luminary Leader Partner. “With a solution attuned to the performance and scalability requirements of AI, we’re in a great position to help our customers architect a future-proof data platform with the ability to support modern AI workloads.”

To learn more about how Nebul built an AI Cloud service with Lightbits, read the case study.

Additional resources:

  • Lightbits Data Platform for AI and Machine Learning
  • Traffic lights, AI and everything in between…(an introduction to software-defined storage for RAG)
  • The Rise of High-Performance Cloud Storage for AI

About Lightbits Labs

Lightbits Labs® (Lightbits) invented the NVMe over TCP protocol and is a pioneer in software-defined block storage that enables organizations to architect IT infrastructure with cloud-like efficiency. Its platform is built from the ground up to solve the common data center challenges of agility, scalability, and efficiency while delivering the highest performance and lowest latencies for business-critical workloads at scale. Lightbits is backed by enterprise technology leaders Cisco Investments, Dell Technologies Capital, Intel Capital, Lenovo, and Micron, and is on a mission to deliver a robust cloud storage platform with unmatched performance, cost-efficiency, agility, and flexibility.

Lightbits and Lightbits Labs are registered trademarks of Lightbits Labs, Ltd.

All trademarks and copyrights are the property of their respective owners.

Lightbits PR Contact: Carol Platz pr@lightbitslabs.com