(March 29, 2018)
NVIDIA is upping its game with GPUs built for machine-learning workloads. Such systems are installed in cloud server facilities to accelerate machine-learning computations. NVIDIA is pivoting its expertise in graphics processors into supplying high-performance numerical calculation engines. These systems pack 2 petaflops of compute into a relatively small box that also contains conventional CPUs and a massive amount of SSD-based storage. The need being filled is the enormous computation required to run artificial-intelligence algorithms.
NVIDIA claims the improvement rate for this product line is faster than Moore's Law, meaning that system capabilities are doubling faster than every 18 months. If so, a revolution is underway in areas requiring this sort of massive computation capability.
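To see why the doubling period matters so much, here is a minimal sketch of compounding capability growth. The function name and the specific periods compared are illustrative assumptions, not figures from NVIDIA.

```python
# Illustrative sketch: relative capability growth for different
# doubling periods. All specific numbers here are hypothetical.

def capability_after(months, doubling_period_months, start=1.0):
    """Relative capability after `months`, doubling every
    `doubling_period_months` months."""
    return start * 2 ** (months / doubling_period_months)

# Moore's-Law pace: doubling every 18 months.
moore = capability_after(60, 18)    # roughly 10x over five years
# A faster hypothetical pace: doubling every 12 months.
faster = capability_after(60, 12)   # 32x over five years

print(f"18-month doubling over 5 years: {moore:.1f}x")
print(f"12-month doubling over 5 years: {faster:.1f}x")
```

Even a modest shortening of the doubling period compounds into a large gap within a few years, which is why such claims, if sustained, matter.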