Nvidia rents its AI SuperPOD platform for $90,000 per month


Nvidia aims to make AI work and development more accessible by giving researchers easy access to its DGX 2 supercomputer. The company announced that it will introduce a subscription service for its DGX SuperPOD to provide an affordable entry into the world of supercomputers.

According to the company, the DGX 2 delivers two petaflops of AI performance and is equipped with 16 Nvidia V100 Tensor Core GPUs designed for large AI projects.
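As a rough sanity check, the two-petaflop headline figure follows directly from the GPU count, assuming the commonly cited peak of roughly 125 teraflops of mixed-precision Tensor Core throughput per V100 (a figure taken from Nvidia's V100 specifications, not from this announcement):

```python
# Rough sanity check of the DGX-2 headline figure.
# Assumption: ~125 TFLOPS of mixed-precision Tensor Core peak per V100.
V100_TENSOR_TFLOPS = 125   # assumed per-GPU peak, mixed precision
GPU_COUNT = 16             # V100 GPUs in a DGX-2

total_tflops = V100_TENSOR_TFLOPS * GPU_COUNT
print(f"{total_tflops} TFLOPS ≈ {total_tflops / 1000} petaflops of peak AI performance")
# -> 2000 TFLOPS ≈ 2.0 petaflops
```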

Despite the company’s marketing framing of the new subscription service, “affordable” is still relative: Nvidia’s SuperPOD subscription will cost $90,000 a month when it launches this summer.

The Nvidia Base Command Platform is powered by its DGX computers, and Nvidia works with NetApp for storage. Nvidia also announced that it is partnering with Amazon Web Services and Google Cloud for cloud instances. The company claims this hybrid experience will let developers schedule jobs on-premises or in the cloud, as sketched below.
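Nvidia has not published interface details in this announcement, but the hybrid idea amounts to routing one and the same job description either to the local SuperPOD scheduler or to a rented cloud instance. The sketch below is purely hypothetical illustration; the `Job` fields and the `submit_*` helpers are not Base Command's actual API.

```python
# Hypothetical sketch of "schedule on-premises or in the cloud".
# Neither the Job fields nor the submit_* helpers are Nvidia's real API.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    container_image: str   # e.g. a training container from a registry
    gpus: int              # GPUs requested for the run
    command: str           # training command to execute

def submit_on_prem(job: Job) -> str:
    # Placeholder: would hand the job to the local SuperPOD scheduler.
    return f"on-prem queue accepted '{job.name}' ({job.gpus} GPUs)"

def submit_cloud(job: Job, provider: str) -> str:
    # Placeholder: would launch the same container on AWS or Google Cloud.
    return f"{provider} instance launched for '{job.name}' ({job.gpus} GPUs)"

def dispatch(job: Job, target: str = "on-prem") -> str:
    """Route a single job description to either environment unchanged."""
    if target == "on-prem":
        return submit_on_prem(job)
    return submit_cloud(job, provider=target)

job = Job(name="resnet-train", container_image="example.registry/pytorch:latest",
          gpus=8, command="python train.py")
print(dispatch(job, target="on-prem"))
print(dispatch(job, target="aws"))
```

The point of the sketch is only that the job definition stays identical; which backend runs it becomes a scheduling decision rather than a porting effort.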

With the cloud and a subscription model, AI researchers now only need a smaller form factor that is easier to fit into a server, and Nvidia is positioning its new AI service as part of the company’s effort to democratize working with artificial intelligence. In the consumer space, Nvidia likewise relies on the cloud through its GeForce Now service to bring the power of its graphics technology to consumers who may not be able to purchase, operate, or afford their own discrete GPU setup for gaming.

For reference, Nvidia’s AI-powered DGX 2 supercomputer retailed at $399,000 and was billed as the world’s largest GPU, while the newer, more powerful DGX A100 starts at $199,000 and delivers five petaflops of AI performance.
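Taking the list prices quoted in this article at face value (and ignoring that a SuperPOD bundles many DGX systems plus networking, storage, and support), a back-of-the-envelope comparison shows how quickly the monthly fee approaches the purchase price of a single machine:

```python
# Back-of-the-envelope comparison using only the prices quoted in this article.
# Ignores that a SuperPOD is far more than one DGX (networking, storage, support).
SUBSCRIPTION_PER_MONTH = 90_000
DGX_A100_LIST_PRICE = 199_000
DGX_2_LIST_PRICE = 399_000

months_vs_a100 = DGX_A100_LIST_PRICE / SUBSCRIPTION_PER_MONTH
months_vs_dgx2 = DGX_2_LIST_PRICE / SUBSCRIPTION_PER_MONTH
print(f"~{months_vs_a100:.1f} months of rent equals one DGX A100")  # ~2.2
print(f"~{months_vs_dgx2:.1f} months of rent equals one DGX-2")     # ~4.4
```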

The company claims the new subscription model will let customers “experience the best of Nvidia software and hardware that is easy for you to use” with no contractual obligations, and it is designed as a way to try out Nvidia’s solutions.

Nvidia’s subscription-based model will also include its BlueField-2 data processing unit (DPU) with each DGX.

“A new type of processor, designed to process data center infrastructure software, is needed to offload and accelerate the tremendous compute load of virtualization, networking, storage, security, and other cloud-native AI services,” Nvidia CEO Jensen Huang said of his company’s DPUs earlier this year, when it unveiled its plans for BlueField-3. “The time for BlueField DPU has come.”

Developers get access to Nvidia’s AI Enterprise software, an open-source stack that the company has integrated into a coherent platform with a particular focus on enterprise support. The AI Enterprise software also offers deep integration with VMware vSphere. Customers also gain access to Nvidia Omniverse Enterprise software.

A specific launch date has not been announced, but the company said all of this will come this summer.

In the data center, Nvidia would like to expand the ARM ecosystem beyond purely mobile use. Nvidia announced that, starting next year, it will focus its work on bringing the ARM architecture into data centers. Given that much of the AI work is already done on the GPU, Nvidia claims the role of the traditional CPU will become that of a data orchestrator rather than heavy-duty compute.

By acquiring Arm, the company hopes to help transform the data center into an equally powerful, energy-efficient solution for AI workloads. Nvidia executives had already signaled their ambitions for the ARM architecture earlier this year when they announced their Grace CPU for AI supercomputers.

Follow Digital Trends for the latest news from Computex.
