The Google Cloud Platform has an exciting new solution for developers searching for machine learning functionality in a box. The arrival of Deep Learning Containers, or "DLC" as Google calls them, gives developers a ready-made machine learning environment that can run either remotely or on a local machine. What's more, the Google Cloud Deep Learning Containers come pre-configured with popular frameworks like PyTorch and Google's own TensorFlow.
The Deep Learning Containers are available alongside the existing Deep Learning VMs in the Google portfolio. Those VMs are GCP's range of Debian-based disk images that come equipped with access to NVIDIA GPUs for running machine learning frameworks. There are currently 14 of those VM images available, including Intel MKL with CUDA, TensorFlow, and PyTorch - the machine learning library for Python. However, some of the images currently available from Google are still advertised as "Experimental."
Right now, users can access up to 20 deep learning container images, including PyTorch, TensorFlow, R, and more, with GPU and CPU options depending on your needs. Other DLC options are on the horizon.
Machine Learning in a Box with Google
According to the latest update from Google, the company is working towards feature parity with all of the Deep Learning VM types on the market. One intriguing thing about this solution for today's developers is that you don't necessarily need to spend a lot of money to start using the ML images locally. Each image available from Google today is built around a Python 3 environment, with a selection of ML libraries and a Jupyter server that starts automatically.
For those interested in accessing the Deep Learning Containers, you'll need to decide whether you want a GPU or CPU option. For the GPU route, you'll also have to make sure you have a working Docker setup with access to a CUDA 10-compatible GPU and nvidia-docker, which ensures the containers can use hardware acceleration through NVIDIA's CUDA computing platform. Once you've installed the Google Cloud SDK, you'll be able to pull the image you'd like to run and get started.
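That workflow can be sketched with a few commands. This is a hedged example, not official documentation: the repository path and image names (tf-cpu, tf-gpu) reflect Google's public deeplearning-platform-release registry at the time of writing and may change.

```shell
# List the deep learning container images Google publishes
gcloud container images list \
  --repository="gcr.io/deeplearning-platform-release"

# Pull and run a CPU TensorFlow image locally; the bundled
# Jupyter server is then reachable at http://localhost:8080
docker run -d -p 8080:8080 \
  gcr.io/deeplearning-platform-release/tf-cpu

# For GPU images, run through nvidia-docker so the container
# can see the CUDA-capable device
docker run --runtime=nvidia -d -p 8080:8080 \
  gcr.io/deeplearning-platform-release/tf-gpu
```

Once the container is up, opening the mapped port in a browser drops you straight into the preconfigured Jupyter environment.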
One point to note, however, is that the images can be quite large, with the PyTorch image weighing in at about 8GB and the TensorFlow image at about 6GB. Each image also ships with Jupyter - the open-source interactive tool for sharing and managing code, visualizations, and text - which has quickly emerged as a standard in the data science community.
Google hopes, however, that developers will eventually want access to a beefier machine than a local setup can provide. With the DLC solution, companies have the scale and freedom to customise a container to their individual needs: they can include their own Python packages, upload the image to the GCP Container Registry, and deploy it to GCP using one of several container-management options. For instance, you might choose Cloud Run for serverless workloads or Google Kubernetes Engine. You'll also be able to deploy to AI Platform Notebooks instances, which let you manage your work through a managed Jupyter notebook service.
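The derivative-container flow described above might look like the following sketch. The base image is one of Google's published DLC images, while the project ID (my-project), image tag, and the extra package are placeholders you would swap for your own.

```shell
# A minimal derivative image: extend a DLC base image with
# your own Python packages (xgboost here is just an example)
cat > Dockerfile <<'EOF'
FROM gcr.io/deeplearning-platform-release/tf-cpu
RUN pip install --no-cache-dir xgboost
EOF

# Build the image and push it to the GCP Container Registry;
# from there it can be deployed via Cloud Run, GKE, or an
# AI Platform Notebooks instance
docker build -t gcr.io/my-project/my-dlc:v1 .
docker push gcr.io/my-project/my-dlc:v1
```

Building on the DLC base means the derivative image inherits the Python 3 environment, ML libraries, and Jupyter server, so you only layer on what is specific to your team.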
Integration with Google Services is Also Available
It probably won't surprise you to learn that the DLC solution from Google also integrates with other Google services: you can reach Cloud Dataproc, BigQuery, and Cloud Dataflow for your machine learning needs. The container-focused approach, and the option to begin running locally, means that Deep Learning Containers offer a simple on-ramp into the GCP environment for those interested in machine learning.
Microsoft and Amazon Web Services have been offering their own machine learning options for a while now. However, Google is hoping that the simplicity of the GCP offering, combined with its close ties to Kubernetes, will provide a more appealing alternative for today's customers.
If you're interested in finding out more about creating derivative containers that you can deploy and manage with Google Cloud, you can learn about the new beta offering here.