Lambda Raises $320 Million to Expand AI Cloud Business

Lambda, the leading GPU cloud company, has secured $320 million in a recent Series C funding round. The investment was led by US Innovative Technology (USIT) and included participation from B Capital, SK Telecom, T. Rowe Price Associates, and other prominent investors.

This injection of funds will enable Lambda to further expand its AI cloud business, which includes its popular on-demand and reserved cloud offerings. With over a decade of experience building AI infrastructure at scale, Lambda has become a trusted provider of NVIDIA H100 Tensor Core GPUs, giving AI developers fast access to the latest architectures for training, fine-tuning, and inference of generative AI and large language models.

In addition to serving over 100,000 customer sign-ups on Lambda Cloud, the company caters to more than 5,000 customers across industries including manufacturing, healthcare, pharmaceuticals, financial services, and the U.S. government. Its AI Cloud has been adopted by renowned companies and research institutions such as Anyscale, Rakuten, and The AI Institute.

Lambda CEO and co-founder Stephen Balaban emphasized the transformative power of AI and the increasing demand for GPU compute. He stated, “This latest financing supports our mission to make GPU compute as ubiquitous as electricity.” The funding will allow Lambda to meet the growing need for GPU resources as AI continues to reshape science, commerce, and industry.

Thomas Tull, Chairman of USIT, highlighted the importance of investing in strong infrastructure to maintain the United States’ leadership in AI advancements. He praised Lambda’s unique combination of hardware, cloud infrastructure, and connective software, stating that the company’s platform will serve as the foundation for future AI hyperscalers.

Lambda’s dedication to innovation is evident in its deployment of cutting-edge technologies. The company recently became one of the first public clouds to offer NVIDIA H100 GPUs and GH200 Grace Hopper Superchip-powered systems. Even amid surging demand for generative AI, Lambda has consistently offered high availability of the latest NVIDIA GPUs at competitive prices.

In a rapidly evolving AI landscape, Lambda’s strategic partnership with Anyscale has proven crucial. By leveraging Anyscale’s open-source framework, Ray, Lambda enables customers to easily access NVIDIA GPUs for large-scale distributed training and inference, driving progress in AI workloads; a minimal sketch of this pattern appears below.
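
As an illustration of that pattern, here is a minimal sketch of fanning inference work out across GPUs with Ray’s public Python API (ray.init, @ray.remote, ray.get). The task body and batch contents are placeholders rather than anything taken from Lambda’s or Anyscale’s materials; a real task would load and run an actual model.

```python
import ray

ray.init()  # connect to an existing Ray cluster, or start a local one

@ray.remote(num_gpus=1)  # schedule each task onto its own GPU
def run_inference(batch):
    # Placeholder: a real task would load a model (e.g. with PyTorch)
    # and run it over `batch`; here we just echo the inputs.
    # (Drop num_gpus above if no GPU is present, or the task will wait for one.)
    return [f"prediction for {item}" for item in batch]

# Fan the batches out across available GPUs and gather the results.
batches = [["prompt 1", "prompt 2"], ["prompt 3", "prompt 4"]]
futures = [run_inference.remote(b) for b in batches]
print(ray.get(futures))  # blocks until every task has finished
```

The same decorator-based pattern scales from a single GPU workstation to a multi-node cluster, which is part of what makes Ray a natural fit for on-demand cloud GPUs.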

Lambda’s latest funding round positions the company for continued growth and reinforces its commitment to providing accessible, performant cloud infrastructure tailored to the demands of AI. With this investment, Lambda aims to shape the future of AI and facilitate technological advances across industries.

Key terms:
1. GPU: Stands for Graphics Processing Unit, a specialized processor originally designed to accelerate the creation of images, animations, and video, and now widely used for the highly parallel computation that AI workloads require.
2. AI: Stands for Artificial Intelligence, a field of computer science that focuses on developing systems that can perform tasks that typically require human intelligence.
3. Cloud: Refers to the delivery of computing services, including storage, software, and processing power, over the internet.
4. Tensor Core GPUs: NVIDIA GPUs with dedicated Tensor Core units that accelerate the matrix operations at the heart of deep learning workloads (see the sketch after this list).
5. AI infrastructure: The underlying hardware, software, and networking components required to support the development and deployment of AI systems.
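
To make item 4 concrete, here is a minimal sketch of the kind of matrix multiplication Tensor Cores accelerate, written with PyTorch’s autocast; the matrix sizes are arbitrary and the example is illustrative only, not taken from Lambda’s materials.

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

if device == "cuda":
    # Under autocast, eligible ops such as this matmul run in FP16,
    # the reduced-precision path Tensor Cores are built to accelerate.
    with torch.autocast(device_type="cuda", dtype=torch.float16):
        c = a @ b
else:
    c = a @ b  # CPU fallback: no Tensor Cores involved

print(c.dtype, c.shape)
```

On recent NVIDIA data-center GPUs such as the H100, this half-precision path is typically substantially faster than running the same multiplication in full FP32 precision.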
