Comprehensive guide: Installing GPU-enabled Caffe2 by building from source on Ubuntu 16.04

Source: Internet
Author: User
Tags: pytorch, cuda toolkit

Comprehensive Guide: Installing Caffe2 with GPU support from source on Ubuntu 16.04 (translation):

Originally from: https://tech.amikelive.com/node-706/comprehensive-guide-installing-caffe2-with-gpu-support-by-building-from-source-on-ubuntu-16-04/. I have to say that the original author is knowledgeable, the research is thorough, and the environment configuration is explained in detail. I think this post is quite valuable and may be of some help to deep learning practitioners, so I decided to translate it and have tried to preserve the original meaning. If there is any infringement, please contact me.

With the infrastructure in place (translator's note: the deep learning environment), we can start delving into deep learning: building, training, and validating deep neural network models and applying them to specific problem domains. Translating deep learning primitives into low-level bytecode execution can be a daunting task, especially for practitioners with no interest in the computational internals of deep learning. Fortunately, several deep learning frameworks provide high-level programming interfaces (i.e., deep learning platforms) that help carry out deep learning tasks.

In this article, we describe the installation of Caffe2, one of the major deep learning frameworks. Caffe2 builds on Caffe, the deep learning framework developed by the Berkeley Vision and Learning Center (BVLC) at the University of California, Berkeley. Caffe2 started as an effort to improve Caffe, in particular to better support large-scale distributed model training, mobile deployment, lower-precision computation, new hardware, and the flexibility to port to multiple platforms.

Why install from source?

It is important to note that, in addition to installing from source, there are simpler options, such as pulling and running a Caffe2 Docker image. The Docker repository for the Caffe2 images can be accessed here. However, if you check the list, the images are not up to date; they were built against older versions of the CUDA Toolkit and cuDNN. Therefore, anyone who intends to use the latest Caffe2, CUDA, or cuDNN features should consider installing from source. A rough sketch of the Docker route is shown next for comparison.
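This is only an illustrative sketch, not part of the original guide: the image name caffe2ai/caffe2 and the nvidia-docker wrapper are assumptions about how the GPU images were commonly published, so check the Docker repository for the actual image names and tags.

# Pull a prebuilt Caffe2 image (image name is an assumption; check the repository)
docker pull caffe2ai/caffe2

# Run it with GPU access and confirm that the Caffe2 Python module loads
nvidia-docker run --rm -it caffe2ai/caffe2 \
  python -c 'from caffe2.python import workspace; print(workspace.NumCudaDevices())'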

Caffe2 is now part of PyTorch.

Caffe2 was previously maintained in a separate repository. Since the end of March 2018, Caffe2 has been merged into the PyTorch repository (translator's note: it is no longer possible to build Caffe2 from a standalone caffe2 source tree, since not all of its submodules can be cloned that way anymore). Consequently, the Caffe2 build process has been integrated into PyTorch.

There may be questions about the motivation for merging the source code. As explained in the announcement post, the merge was preceded by shared development infrastructure between Caffe2 and PyTorch, which led to a growing amount of shared code and common libraries. It was then concluded that merging the two projects into a single code base would increase engineering efficiency and improve the robustness of the framework, especially in the area of model deployment.

What if I have previously installed Caffe2? If Caffe2 was installed before the source code merge, the build process produced header files, dynamic libraries, and Python libraries that were copied into the specified installation directories. In theory, replacing the old headers and libraries with the files from the latest version is sufficient. If that does not work, remove the old files from the installation directories before proceeding with a clean installation.

Prerequisites

We will install Caffe2 with GPU support, so this article covers the installation process only for computers with an NVIDIA GPU. Verify that the following items are met before proceeding with the installation (a short command-line sketch for double-checking items 1-3 appears at the end of this prerequisites section).

    1. Verify that the NVIDIA graphics driver is properly installed.

      You can refer to this article for installing the NVIDIA graphics driver and checking that it works.

    2. Verify that the CUDA Toolkit is installed.
      If you have not installed the CUDA Toolkit, are unsure whether it is installed, or want to install a newer version of the CUDA Toolkit, see this article, which provides more details about the CUDA Toolkit installation.

    3. Verify that cuDNN is installed.
      For more details about the cuDNN installation, please see this article.

    4. Remove files from a previous Caffe2 installation.
      If you have previously installed Caffe2 and want to upgrade, you may need to remove the files that were built and copied during the previous Caffe2 installation.
      Note: This step can usually be skipped, because the build tool first checks for an existing Caffe2 installation on the system and then replaces it with the new version.

sudo rm -vRf /usr/local/include/caffe2
sudo rm -vf /usr/local/lib/libcaffe2*

In addition, we may need to remove the Python API from the Python package directory (the path may vary depending on the version of Python installed).
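If you are not sure which directory that is, a quick optional check (not part of the original guide) is to print the package directories your interpreter actually searches with the standard site module:

python2 -m site

Then remove the caffe2 package from the matching dist-packages directory, for example: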

sudo rm -vRf /usr/local/lib/python2.7/dist-packages/caffe2
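Before moving on, prerequisites 1-3 can be double-checked quickly from the command line. This is only a convenience sketch: nvidia-smi and nvcc are the standard driver and toolkit tools, while the cudnn.h path is an assumption (a common default) and may differ on your system.

# 1. NVIDIA driver: should list the GPU(s) and the driver version
nvidia-smi

# 2. CUDA Toolkit: should print the toolkit release
nvcc --version

# 3. cuDNN: the version macros live in cudnn.h (path is an assumption)
grep -m1 -A2 "#define CUDNN_MAJOR" /usr/local/cuda/include/cudnn.h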

After checking all the prerequisites, we can now proceed with the installation.

Installation steps

You can complete the installation by performing these steps sequentially.

    1. Update the APT package index
sudo apt-get update
    2. Install the APT package dependencies
sudo apt-get install -y --no-install-recommends build-essential cmake git libgoogle-glog-dev libgtest-dev libiomp-dev libleveldb-dev liblmdb-dev libopencv-dev libopenmpi-dev libsnappy-dev libprotobuf-dev openmpi-bin openmpi-doc protobuf-compiler python-dev python-pip
    3. Install the pip dependencies
sudo pip install --upgrade pip
sudo pip install setuptools future numpy protobuf
sudo apt-get install -y --no-install-recommends libgflags-dev
    4. Clone Caffe2 into a local directory

Note: We create a directory named caffe2-pytorch and clone the PyTorch git repository into it.

mkdir caffe2-pytorch && cd caffe2-pytorch
git clone --recursive https://github.com/pytorch/pytorch.git ./
git submodule update --init
    5. Build Caffe2
mkdir build && cd build
cmake ..
sudo make -j"$(nproc)" install

Note: When building the source code, we pass the -j flag. This flag sets the number of jobs (threads) that the build tool may spawn for the compiler (i.e., GCC) while building the source. The nproc command prints the number of available CPU cores. In short, we speed up compilation by running as many parallel build jobs as there are CPU cores.
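If you prefer to make the GPU-related options explicit instead of relying on the defaults, the configure-and-build step can be written roughly as follows. This is a sketch: USE_CUDA and USE_CUDNN are CMake switches exposed by the PyTorch/Caffe2 build, and CMAKE_INSTALL_PREFIX is the standard CMake install prefix; verify the option names against the CMakeLists.txt of the version you are building.

cmake .. -DUSE_CUDA=ON -DUSE_CUDNN=ON -DCMAKE_INSTALL_PREFIX=/usr/local
sudo make -j"$(nproc)" install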

    6. Create symbolic links for the Caffe2 shared libraries
sudo ldconfig
    7. Verify that the Caffe2 library files and header files are installed.
    • Update the locate database
sudo updatedb
    • Make sure that libcaffe2.so is in /usr/local/lib
locate libcaffe2.so
    • Verify that the header files are in /usr/local/include/caffe2
locate caffe2 | grep /usr/local/include/caffe2
    8. Add Caffe2 to the Python and library search paths so that other applications can find it
vim ~/.profile

# set python path
if [ -z "$PYTHONPATH" ]; then
    PYTHONPATH=/usr/local
else
    PYTHONPATH=/usr/local:$PYTHONPATH
fi

# set library path
if [ -z "$LD_LIBRARY_PATH" ]; then
    LD_LIBRARY_PATH=/usr/local/lib
else
    LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
fi

We can also use shell parameter expansion to set the paths as follows:

PYTHONPATH=/usr/local${PYTHONPATH:+:${PYTHONPATH}}
LD_LIBRARY_PATH=/usr/local/lib${LD_LIBRARY_PATH:+:${LD_LIBRARY_PATH}}

Reload the profile so that the new paths take effect:

source ~/.profile
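As an optional sanity check, print both variables and confirm that they now include the Caffe2 locations:

echo "$PYTHONPATH"        # should contain /usr/local
echo "$LD_LIBRARY_PATH"   # should contain /usr/local/lib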
    9. Verify that the Caffe2 Python module can be imported correctly
python -c 'from caffe2.python import core' 2>/dev/null && echo "Success" || echo "Failure"
    10. Verify that Caffe2 can run with GPU support
python2 -c 'from caffe2.python import workspace; print(workspace.NumCudaDevices())'
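A device count greater than zero means GPU support is in place. As an additional optional smoke test (a sketch, not part of the original guide), the snippet below feeds a small blob and runs a single ReLU operator on the first CUDA device using the standard caffe2.python workspace and core APIs:

python2 - <<'EOF'
import numpy as np
from caffe2.proto import caffe2_pb2
from caffe2.python import core, workspace

# Pin the input blob and the operator to the first CUDA device
device = core.DeviceOption(caffe2_pb2.CUDA, 0)

# Feed a small random blob and run a ReLU operator on the GPU
x = np.random.randn(2, 3).astype(np.float32)
workspace.FeedBlob("x", x, device_option=device)
workspace.RunOperatorOnce(core.CreateOperator("Relu", ["x"], ["y"], device_option=device))

# Fetch the result back to the host and print it
print(workspace.FetchBlob("y"))
EOF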

