Nvidia today launched a tool designed to help developers and data scientists build and test machine learning systems on their personal computers before moving into production on a more powerful machine.
Nvidia GPU Cloud provides researchers with software containers designed to give developers the fastest execution environment for training machine learning systems that run on the chip maker's silicon. Those containers were already available for use with the DGX-1 and DGX Station computers, along with cloud instances running Nvidia Volta chips on Amazon Web Services.
But now customers can use them on consumer hardware, in this case the Nvidia Titan chip series. Those high-end consumer GPUs won't provide as much firepower as machines built specifically for large-scale machine learning, but they are less expensive and more widely available.
Because the Nvidia GPU Cloud software is packaged in software containers, developers can take systems trained on a personal machine and deploy them more easily on one of Nvidia's larger-scale AI machines, or in the cloud.
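For readers unfamiliar with the container workflow, the round trip might look something like the following. This is a hedged sketch, not taken from the announcement: the registry host (nvcr.io) is Nvidia's NGC container registry, but the image tag, project paths, and training script are illustrative placeholders.

```shell
# Pull a GPU-accelerated framework image from Nvidia's NGC container
# registry (image tag is illustrative).
docker pull nvcr.io/nvidia/tensorflow:17.12

# Train locally on a Titan-class GPU using the nvidia-docker runtime,
# mounting a hypothetical local project directory into the container.
nvidia-docker run --rm -v $HOME/project:/workspace \
    nvcr.io/nvidia/tensorflow:17.12 python train.py

# Because the environment travels with the container, the same image
# and command can later run unchanged on a DGX-1, a DGX Station, or a
# Volta-backed AWS instance.
```

The point of the design is that the container, not the host machine, defines the software stack, so the jump from a desktop to a production machine changes only where the image runs.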
Overall, the move is meant to help people get off the ground with machine learning systems and iterate faster toward systems that could help them solve business problems and advance the AI field.
While the large technology giants have no problem throwing dozens, if not hundreds, of GPUs at a single machine learning problem, developers and researchers will often begin testing their systems on smaller personal machines without that much firepower. This announcement should give them a speed boost.
The news comes as part of the Conference on Neural Information Processing Systems (NIPS), which is being held this week in Long Beach, California. That show brings together some of AI's brightest minds to share key developments in their research.