For a while now, the Google Cloud Platform team has been working to prove that it's not just building a powerful product for cloud storage and data management; it's also offering an environment where companies can invest in AI and machine learning. Over the last couple of years, we've seen huge improvements in the way that Google supports artificial intelligence and machine learning, with new natural language processing solutions, advanced analytics tools and more.
As GCP continues to invest in new tools and collaborations to make its AI initiative more compelling, Google is positioning itself to become a leading cloud provider for artificial intelligence. That would be a notable accomplishment for a company that has said in recent years that it aims to make AI more accessible to everyone.
Today, the Google Cloud AI-optimized environment lets businesses of every background train state-of-the-art machine learning models at remarkable speed and scale, and at comparatively low cost. These inherent advantages recently set GCP apart when it posted brand-new performance records in the latest round of the MLPerf benchmark competition, the industry standard for measuring machine learning performance.
Transforming the AI Environment
During the MLPerf competition, the Google Cloud Platform set three new performance records, all with results running on its Cloud TPU v3 Pods. These Pods are the latest generation of supercomputer built by Google specifically to address the demands of machine learning. The results also showcased the incredible speed of the Cloud TPU Pods: each of the winning runs required less than two minutes of computing time.
Following the reveal of the latest MLPerf benchmark results, Google Cloud has emerged as the first public cloud provider to outperform even on-premise systems at running industry-standard, large-scale machine learning training workloads. In the SSD and Transformer categories, Cloud TPU v3 Pods trained models over 84% faster than the most competitive on-premise systems in the MLPerf closed division.
The Transformer model architecture is currently at the heart of modern strategies for natural language processing, enabling major improvements in language modeling, machine translation and high-quality text generation. The SSD (Single Shot Detector) model architecture, meanwhile, is commonly used for object detection, a core component of computer vision applications such as autonomous driving, medical imaging and photo editing.
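To give a flavor of what sits inside the Transformer models benchmarked here, the sketch below implements scaled dot-product attention, the core operation of the Transformer architecture, in plain NumPy. This is a minimal illustration for intuition only, not Google's benchmark code; the tiny random inputs are purely hypothetical.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core Transformer operation: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity between each query and key
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights  # weighted mix of values, plus the weights

# Tiny hypothetical example: 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
output, weights = scaled_dot_product_attention(Q, K, V)
```

Each output row is a mixture of the value vectors, weighted by how strongly the corresponding query attends to each key; stacking many of these attention layers is what makes Transformer training so compute-hungry, and hence such a natural fit for TPU Pods.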
To demonstrate the breadth of ML workloads that Cloud TPUs can run in the modern environment, Google also submitted results in the Mask R-CNN and NMT categories. Mask R-CNN is a model for instance segmentation, while the NMT model performs neural machine translation.
Record Breaking Results with the GCP
The Google Cloud Platform gives today's business leaders the flexibility to select the price point and performance level that suit their AI environment. The wide range of Cloud TPU Pod configurations available, called slice sizes, means almost anyone can find the right approach to AI development on Google Cloud. There are even tutorials available from Google to help get you started.
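As a rough illustration, provisioning a small Cloud TPU slice from the command line might look like the following. The TPU name, zone and software version here are hypothetical placeholders, and the exact flags can vary between gcloud versions, so check Google's own tutorials before running anything:

```shell
# Create a small Cloud TPU ("v3-8" is the smallest v3 slice size).
# "my-tpu", the zone and the version are placeholder values.
gcloud compute tpus create my-tpu \
    --zone=us-central1-a \
    --accelerator-type=v3-8 \
    --version=1.14
```

Larger slice sizes (up to full v3 Pods) follow the same pattern with a different accelerator type, which is what lets teams dial cost and performance up or down.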
Google's growing customer base has already begun to see significant benefits from the performance and scale of Cloud TPU Pods. One customer, Recursion Pharmaceuticals, can now complete a training run in just 15 minutes on Cloud TPU Pods, compared to a full 24 hours on its local GPU cluster.
If deep learning is critical to your business, you can contact a Google Cloud representative to start developing your TPU Pod strategy today. Google has also announced that it is continuing its mission to make its AI platform as advanced as possible, creating the best possible environment for companies that want to design and run machine learning workloads for the future growth of their business.
The fact that Google now offers record-breaking machine learning performance, even outpacing on-premise AI solutions, shows that the company is already making great progress toward those goals. We can't wait to see what comes next from the GCP environment.