Table of contents

  1. Introduction
    1.1. Why should you use DLAMI?
  2. What is Jupyter Notebook?
    2.1. Set up a Jupyter Notebook Server
  3. Upgrading to a New DLAMI Version
  4. Tips for Software Updates
  5. Frequently Asked Questions
  6. Conclusion
Last Updated: Mar 27, 2024

Using and Upgrading a DLAMI

Author: Vishal Teotia

Introduction

DLAMI (Deep Learning AMI) is a one-stop shop for deep learning in the cloud. This customized machine image is available in many Amazon EC2 Regions and can be launched on a range of instance types, from a small CPU-only instance up to the latest high-powered multi-GPU instances.

An Amazon Machine Image (AMI) provides the information required to create a virtual server in the cloud, known as an instance. You specify an AMI when you launch an instance, and you can run as many instances from that AMI as you need. You can also launch instances from as many different AMIs as you need.

Why should you use DLAMI?

You can train neural networks in two ways: on the CPU or on the GPU. All modern frameworks support GPU training, since it is far more cost- and time-efficient than training on the CPU.

Some criteria need to be met in order to leverage these GPU advantages:

  1. Access to the GPU is required.
  2. The GPU drivers must be set up correctly.
  3. When training neural networks, you need libraries that can make full use of the GPU, and those libraries must be compatible with the hardware and drivers from items #1 and #2.
  4. To use a neural network, you need a framework that has been built against those libraries.

To use the GPU, you either need to download the framework's source code along with the libraries and build everything yourself, or download a pre-built version of the framework with GPU support and then install the required libraries. Either way, there is a significant drawback: both approaches require some technical knowledge before you can get started, which is one reason GPU-enabled neural network frameworks are not more widely used.

DLAMI is a solution that includes everything that is needed out of the box:

  • Drivers for the latest NVIDIA GPUs.
  • The latest libraries (CUDA and cuDNN).
  • Pre-built frameworks compiled with GPU support.

Here is a list of the frameworks that come pre-installed on the DLAMI and are ready to use: MXNet, Caffe, Caffe2, TensorFlow, Theano, CNTK, Torch, and Keras.
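
As a quick sanity check that this stack is actually wired up on your instance, here is a minimal sketch, assuming a GPU-backed DLAMI with the NVIDIA driver installed and the pre-built TensorFlow environment active (both ship with the DLAMI):

    # A minimal sketch: verify the GPU stack on a DLAMI, assuming a GPU instance
    # with the NVIDIA driver and the pre-built TensorFlow environment.
    import subprocess

    import tensorflow as tf

    # Driver check: nvidia-smi succeeds only if a GPU is attached and the driver works.
    subprocess.run(["nvidia-smi"], check=True)

    # Library check: was this TensorFlow build compiled against CUDA?
    print("Built with CUDA:", tf.test.is_built_with_cuda())

    # Framework check: can the framework actually see the GPU devices?
    print("Visible GPUs:", tf.config.list_physical_devices("GPU"))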

What is Jupyter Notebook?

Jupyter Notebook is an open-source, interactive web application that lets users create and share documents containing live code, visualizations, images, and explanatory text. Data, code, and visualizations can all be gathered in a single notebook, where users can build interactive stories that can be edited and shared.

Jupyter notebooks are widely used, well documented, and offer an easy-to-use interface for creating, editing, and running them. The notebook server runs as a web application whose landing page, the "Dashboard," lets users open files and launch notebooks, with output displayed neatly in the browser. The other component of a notebook is the kernel, which acts as the "computational engine" during execution, much like a web server or back-end application. Python code is executed using the IPython kernel (Jupyter was previously called IPython Notebook), and kernels are available for many other languages.
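
On a DLAMI, each framework environment is typically exposed as its own kernel. A small sketch using the jupyter_client package (installed alongside Jupyter) to list the kernels registered on a machine might look like this:

    # List the kernels registered on this machine, e.g. the per-framework
    # Conda environments that a DLAMI exposes as separate kernels.
    from jupyter_client.kernelspec import KernelSpecManager

    for name, info in KernelSpecManager().get_all_specs().items():
        print(f"{name:30s} -> {info['spec']['display_name']}")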

Set up a Jupyter Notebook Server

A Jupyter notebook server lets you create and run Jupyter notebooks directly on your DLAMI instance. With Jupyter notebooks, you can use AWS infrastructure and the packages bundled with the DLAMI to run machine learning experiments for training and inference.

To set up a Jupyter notebook server, you must:

  • Configure the Jupyter notebook server on your Amazon EC2 DLAMI instance.
  • Configure your client so that you can connect to the Jupyter notebook server. Configuration instructions are available for Windows, macOS, and Linux clients.
  • Test the setup by logging in to the Jupyter notebook server.

As soon as you have the Jupyter server running, you can run the tutorials in your web browser. If you use the Deep Learning AMI with Conda or have set up Python environments, you can change the Python kernel within Jupyter notebooks. Choose the relevant kernel before trying to run framework-specific tutorials.
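
The exact server configuration is described in the AWS documentation; as a rough illustration of the first step, here is a minimal sketch that hashes a login password and writes a basic Jupyter configuration file on the instance. The traitlet names (c.ServerApp.*) and the config file name assume a recent Jupyter Server install; older classic-Notebook setups use c.NotebookApp.* and jupyter_notebook_config.py instead.

    # A minimal sketch, not AWS's official setup steps: hash a login password
    # and write a basic Jupyter configuration on the DLAMI instance.
    from pathlib import Path

    try:
        from jupyter_server.auth import passwd   # Jupyter Server
    except ImportError:
        from notebook.auth import passwd          # classic Notebook fallback

    hashed = passwd("choose-a-strong-password")   # store only the hash

    config_lines = [
        "c.ServerApp.ip = '127.0.0.1'",        # keep the server local; reach it via an SSH tunnel
        "c.ServerApp.port = 8888",             # Jupyter's default port
        "c.ServerApp.open_browser = False",    # a headless EC2 instance has no browser
        f"c.ServerApp.password = '{hashed}'",  # option name varies across Jupyter versions
    ]

    config_path = Path.home() / ".jupyter" / "jupyter_server_config.py"
    config_path.parent.mkdir(exist_ok=True)
    config_path.write_text("\n".join(config_lines) + "\n")
    print("Wrote", config_path)

With the configuration in place, start the server on the instance and connect from your client over an SSH tunnel to port 8888, as described in the AWS setup instructions.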

Upgrading to a New DLAMI Version

The DLAMI system images are regularly updated with new versions of the deep learning frameworks, CUDA, and other software to keep performance optimized. If you have been using a DLAMI for some time, you need to launch a new instance to benefit from an update, and you would also need to manually transfer datasets, checkpoints, and other valuable data. Using Amazon EBS, however, you can keep your data on a separate volume and attach it to the new DLAMI, so you can upgrade often and move your data over quickly.

Note: An Amazon EBS volume can only be attached to an instance in the same Availability Zone, so keep your old and new DLAMIs in the same Availability Zone when moving a volume between them.

  1. Create a new Amazon EBS volume using the Amazon EC2 console.
  2. Attach the newly created Amazon EBS volume to your existing DLAMI.
  3. Transfer your data, such as datasets, checkpoints, and configuration files, to the volume.
  4. Launch a new DLAMI.
  5. Detach the Amazon EBS volume from your old DLAMI.
  6. Attach the Amazon EBS volume to your new DLAMI.
  7. Verify that your data is available on the new DLAMI, then stop and terminate the old one.
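
The same volume operations can also be scripted. Here is a hedged boto3 sketch of steps 1, 2, 5, and 6; the region, Availability Zone, volume size, instance IDs, and device name are placeholders you would replace with your own values, and it assumes your AWS credentials are already configured:

    # A boto3 sketch of steps 1, 2, 5, and 6. All identifiers below are placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Step 1: create a data volume in the same Availability Zone as both DLAMIs.
    volume_id = ec2.create_volume(
        AvailabilityZone="us-east-1a", Size=100, VolumeType="gp3"
    )["VolumeId"]
    ec2.get_waiter("volume_available").wait(VolumeIds=[volume_id])

    # Step 2: attach it to the existing DLAMI, then copy datasets and checkpoints onto it.
    ec2.attach_volume(VolumeId=volume_id, InstanceId="i-0123456789abcdef0", Device="/dev/sdf")

    # Steps 5-6: later, detach the volume from the old DLAMI and attach it to the new one.
    ec2.detach_volume(VolumeId=volume_id)
    ec2.get_waiter("volume_available").wait(VolumeIds=[volume_id])
    ec2.attach_volume(VolumeId=volume_id, InstanceId="i-0fedcba9876543210", Device="/dev/sdf")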

Tips for Software Updates

There is one caveat to keep in mind: you cannot simply auto-update the frameworks to their latest versions. The frameworks on the DLAMI are custom-built with GPU support, and a blind package update could leave you with a framework that no longer uses the GPU, so update packages at your own risk. Moving to new framework versions therefore means moving to a new AMI rather than just updating packages, and migrating to a new AMI can be an involved process, so keep this in mind when you create a new instance. A good practice is to keep your data on a separate EBS volume, which can easily be detached and attached to a new instance launched from the new AMI.

Frequently Asked Questions

  1. What is a Deep Learning AMI?
    AWS Deep Learning AMI (DLAMI) offers comprehensive cloud solutions for deep learning. In most regions of Amazon EC2, you can choose from a variety of instance types, from a small CPU-only instance to a multi-GPU instance with high performance.
     
  2. What is the difference between SageMaker and EC2?
    SageMaker instances currently cost around 40% more than equivalent Amazon EC2 instances, and each time you start a machine it takes about five minutes, which breaks your workflow. SageMaker Studio appears to speed this up, but comes with other drawbacks.
     
  3. What port does Jupyter run on?
    By default, Jupyter runs on port 8888.
     
  4. Can you create an AMI from a snapshot?
    Yes. In the Amazon EC2 console's navigation pane, choose Snapshots under Elastic Block Store, select the snapshot, and choose Create Image. Complete the fields in the Create Image from EBS Snapshot dialog box, then choose Create.
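
    The same thing can be done programmatically; here is a minimal boto3 sketch that registers an AMI from an existing EBS snapshot. The AMI name, snapshot ID, and device names are placeholders.

        # Register an AMI from an existing EBS snapshot (placeholder identifiers).
        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")

        image_id = ec2.register_image(
            Name="my-dlami-from-snapshot",
            RootDeviceName="/dev/xvda",
            VirtualizationType="hvm",
            BlockDeviceMappings=[{
                "DeviceName": "/dev/xvda",
                "Ebs": {"SnapshotId": "snap-0123456789abcdef0"},
            }],
        )["ImageId"]
        print("Registered AMI:", image_id)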

Conclusion

If you want to harness the power of the GPU, DLAMI is the go-to solution, with libraries and frameworks that are already built with GPU support.

Check out this link if you are a Deep Learning enthusiast or want to brush up on your knowledge with Deep Learning blogs.

Do not stress about upcoming Campus Placements. Coding Ninjas has you covered. Visit this link for a carefully crafted course on campus placements and interview preparation.
