Elam's Caffe Notes on Configuration (6): Building Caffe and its Python 3.6 Interface on CentOS 6.5


Requirements:

System: CentOS 6.5
Goal: the Caffe framework with CUDA 8.0 + OpenCV 3.1 + cuDNN v5.1 and a Python 3.6 interface


All in all, configuring Caffe is not as hard as you might imagine. As always, treat the official documentation as the authority; tutorials found online rarely match your setup exactly.
The official documentation for configuring Caffe on CentOS:
http://caffe.berkeleyvision.o...

1. Preparation before installation

General dependencies:

sudo yum install protobuf-devel leveldb-devel snappy-devel opencv-devel boost-devel hdf5-devel

Remaining dependencies:

sudo yum install gflags-devel glog-devel lmdb-devel

Those are all the dependency packages Caffe needs. Here I chose to build and install every one of them by hand, which succeeds far more often than installing them directly with yum.

① Protobuf

Since I am building the Python 3.6 interface, the protobuf version must be at least 3.0.
https://github.com/google/pro...
Download both the C++ and the Python packages.
I chose version 3.2, so I downloaded protobuf-cpp-3.2.0.tar.gz and protobuf-python-3.2.0.tar.gz. Pick a working directory, then:

tar -zxvf protobuf-cpp-3.2.0.tar.gz
cd protobuf-3.2.0
./configure
make
make check
make install
ldconfig
tar -zxvf protobuf-python-3.2.0.tar.gz

After entering the extracted directory:

cd python
python setup.py build
python setup.py test
python setup.py install

After the build completes, you can confirm that the installation succeeded with the following command:

conda list | grep protobuf
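
If you prefer not to rely on conda, a plain import check works as well; this is just a sanity check, assuming the protobuf Python package was installed into the currently active environment:

python -c "import google.protobuf; print(google.protobuf.__version__)"   # should print 3.2.0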

② boost

http://www.boost.org/users/hi...
Download boost_1_65_0.tar.gz:

tar -zxvf boost_1_65_0.tar.gz
cd boost_1_65_0
./bootstrap.sh
./b2
./b2 install

After it finishes, if you find that no libboost_python library was produced, rebuild it:

cd boost_1_65_0
./bootstrap.sh
./b2 --with-python include="path/to/pyconfig.h"    # use locate to find the path to your pyconfig.h

Then, in a terminal, run

locate libboost_python3

and check whether /usr/local/lib/ contains:

/usr/local/lib/libboost_python3.a
/usr/local/lib/libboost_python3.so
/usr/local/lib/libboost_python3.so.1.65.0

If they exist, simply create the symlink:

ln -s /usr/local/lib/libboost_python3.so.1.65.0 /usr/local/lib/libboost_python3.so

If these three files are missing, copy them from the lib folder under boost's stage directory into /usr/local/lib/ and then create the symlink, as in the sketch below.
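
A minimal sketch, assuming the boost source tree was unpacked at ~/boost_1_65_0 (adjust the path to your own location):

# copy the static and versioned shared libraries from the stage area
cp ~/boost_1_65_0/stage/lib/libboost_python3.a /usr/local/lib/
cp ~/boost_1_65_0/stage/lib/libboost_python3.so.1.65.0 /usr/local/lib/
# recreate the .so symlink and refresh the linker cache
ln -sf /usr/local/lib/libboost_python3.so.1.65.0 /usr/local/lib/libboost_python3.so
ldconfig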

③ glog gflags lmdb

These three dependencies can be built and installed with the commands from the official Caffe documentation:

1. glog
wget https://storage.googleapis.com/google-code-archive-downloads/v2/code.google.com/google-glog/glog-0.3.3.tar.gz
tar zxvf glog-0.3.3.tar.gz
cd glog-0.3.3
./configure
make && make install
2. gflags
wget https://github.com/schuhschuh/gflags/archive/master.zip
unzip master.zip
cd gflags-master
mkdir build && cd build
export CXXFLAGS="-fPIC" && cmake .. && make VERBOSE=1
make && make install
3. lmdb
git clone https://github.com/LMDB/lmdb
cd lmdb/libraries/liblmdb
make && make install

④ hdf5

I recommend installing version 1.8.17, since that is the hdf5 version Anaconda ships with.
http://download.csdn.net/down...

tar -zxvf hdf5-1.8.17.tar.gz
cd hdf5-1.8.17
./configure --prefix=/usr/local/hdf5-1.8.17/
make
make check                
make install
make check-install
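
Because this installs hdf5 under the non-standard prefix /usr/local/hdf5-1.8.17/, its lib directory also needs to be visible at runtime; one way to do that (a sketch for a bash shell) is:

echo 'export LD_LIBRARY_PATH=/usr/local/hdf5-1.8.17/lib:$LD_LIBRARY_PATH' >> ~/.bashrc
source ~/.bashrc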

⑤ snappy

yum install snappy

⑥ leveldb

http://download.csdn.net/down...

tar -zxvf leveldb-1.7.0.tar.gz
cd leveldb-1.7.0
make
cp libleveldb* /usr/lib/
cp -r include/leveldb /usr/local/include

⑦ atlas-devel

Install it directly with yum install atlas-devel.

Building Caffe

1. Download Caffe

git clone https://github.com/bvlc/caffe.git

2. Build Caffe

cd caffe
vi Makefile

Find the Configure build section and append -I/usr/local/hdf5-1.8.17/include to the COMMON_FLAGS += line and -L/usr/local/hdf5-1.8.17/lib to the LDFLAGS += line.
Of course, if all of your paths were configured correctly earlier, you can leave these out. See the excerpt below for reference.
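
A hypothetical excerpt of how the two edited lines might end up looking (the exact lines already present in your Makefile may differ):

COMMON_FLAGS += -I/usr/local/hdf5-1.8.17/include
LDFLAGS += -L/usr/local/hdf5-1.8.17/lib
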
Next, modify Makefile.config. If Makefile.config does not exist yet:

cp Makefile.config.example Makefile.config
vi Makefile.config

Below is my complete Makefile.config after the changes; the parts marked with a left arrow (←) are the places that need to be modified.

## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
 USE_CUDNN := 1←←←

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
#       You should not set this flag if you will be reading LMDBs with any
#       possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
 OPENCV_VERSION := 3←←←

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda-8.0←←←
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_30,code=sm_30 \←←←
                -gencode arch=compute_35,code=sm_35 \
                -gencode arch=compute_50,code=sm_50 \
                -gencode arch=compute_52,code=sm_52 \
                -gencode arch=compute_60,code=sm_60 \
                -gencode arch=compute_61,code=sm_61 \
                -gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := atlas←←←
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
# PYTHON_INCLUDE := /usr/include/python2.7 \←←←
                /usr/lib/python2.7/dist-packages/numpy/core/include←←←
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
ANACONDA_HOME := /root/anaconda3←←←
PYTHON_INCLUDE := $(ANACONDA_HOME)/include \←←←
                 $(ANACONDA_HOME)/include/python3.6m \←←←
                 $(ANACONDA_HOME)/lib/python3.6/site-packages/numpy/core/include←←←

# Uncomment to use Python 3 (default is Python 2)
PYTHON_LIBRARIES := boost_python3 python3.6m←←←
# PYTHON_INCLUDE := /usr/include/python3.6m \←←←
                 /usr/lib/python3.6/dist-packages/numpy/core/include←←←

# We need to be able to find libpythonX.X.so or .dylib.
# PYTHON_LIB := /usr/lib←←←
PYTHON_LIB := $(ANACONDA_HOME)/lib \←←←
                $(ANACONDA_HOME)/pkgs/python-3.6.1-2/lib←←←

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
WITH_PYTHON_LAYER := 1←←←

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include  /usr/local/hdf5-1.8.17/include←←←
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib64/atlas /usr/local/hdf5-1.8.17/lib←←←

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @

Save and exit once the modifications are done.
The main points to note:
PYTHON_LIBRARIES: make sure the boost_python3 shared library can be found through ld.so.conf (remember to run ldconfig) or LD_LIBRARY_PATH.
INCLUDE_DIRS and LIBRARY_DIRS: the paths to add here are those of dependencies whose include or lib folders do not live under /usr/local or /usr.
Then run:

make all -jn
make test -jn
make runtest -jn
make pycaffe -jn
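
Here n stands for the number of parallel build jobs; for example, to use every available core:

make all -j$(nproc)
make test -j$(nproc)
make runtest -j$(nproc)
make pycaffe -j$(nproc)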

After the build completes, if you want to import the Caffe Python module, add its path to your $PYTHONPATH environment variable, for example by adding the following line to your ~/.bashrc:

export PYTHONPATH=/path/to/caffe/python:$PYTHONPATH

Then open a terminal and run:

python
import caffe

If no error is reported, the Python interface of Caffe is configured.
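
To double-check from the shell which caffe module is actually being picked up (just a sanity check):

python -c "import caffe; print(caffe.__file__)"   # should point into /path/to/caffe/python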

Problems encountered

① GLIBCXX_3.4.15 not found during make runtest

[root@localhost caffe]# make runtest
.build_release/tools/caffe
.build_release/tools/caffe: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.15' not found (required by /home/HY/caffe/caffe/.build_release/tools/../lib/libcaffe.so.1.0.0)
.build_release/tools/caffe: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.15' not found (required by /usr/local/lib/libopencv_core.so.3.1)
.build_release/tools/caffe: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.15' not found (required by /usr/local/lib/libopencv_imgcodecs.so.3.1)
make: *** [runtest] Error 1

Cause: the shared libraries produced by the newly installed, newer gcc have not replaced those of the older gcc.
For a solution, see this blog post: http://blog.chinaunix.net/uid...

② HDF5 library version mismatch

Warning! ***HDF5 library version mismatched error***
The HDF5 header files used to compile this application do not match
the version used by the HDF5 library to which this application is linked.
Data corruption or segmentation faults may occur if the application continues.
This can happen when an application was compiled by one version of HDF5 but
linked with a different version of static or shared HDF5 library.
You should recompile the application or check your shared library related
settings such as 'LD_LIBRARY_PATH'.
You can, at your own risk, disable this warning by setting the environment
variable 'HDF5_DISABLE_VERSION_CHECK' to a value of '1'.
Setting it to 2 or higher will suppress the warning messages totally.
Headers are 1.8.3, library is 1.8.17

This problem is caused by an hdf5 version mismatch: the system already has version 1.8.3 installed, but the hdf5 bundled with anaconda3 is 1.8.17, so the versions conflict at build time.
To resolve this incompatibility, the real fix is to build hdf5 1.8.17 yourself; see the hdf5 build and install steps in the dependencies section above.
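
As the warning itself notes, you can also suppress the check at your own risk while sorting out the rebuild:

export HDF5_DISABLE_VERSION_CHECK=1   # temporary workaround; rebuilding hdf5 1.8.17 is the proper fix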

③ Python.h not found during make pycaffe

[root@localhost caffe]# make pycaffe
CXX/LD -o python/caffe/_caffe.so python/caffe/_caffe.cpp
python/caffe/_caffe.cpp:1:52: fatal error: Python.h: No such file or directory
 #include <Python.h>  // NOLINT(build/include_alpha)
                                                    ^
compilation terminated.
make: *** [python/caffe/_caffe.so] Error 1

As the message says, the Python.h header file cannot be found, so I ran

find / -name "Python.h"

to search for the file and found:

/root/anaconda2/include/python2.7/Python.h
/root/anaconda2/pkgs/python-2.7.13-0/include/python2.7/Python.h
/root/anaconda3/include/python3.6m/Python.h
/root/anaconda3/pkgs/python-3.6.1-2/include/python3.6m/Python.h

The third path is the one that PYTHON_INCLUDE should point to when editing Makefile.config as above. Opening Makefile.config at that spot, I indeed found my path was wrong: it pointed to python3.6 instead of python3.6m. After changing it to python3.6m and running make again, the error disappeared, so the cause was simply a mis-configured Python path.

④ boost include and library path message

The following directory should be added to compiler include paths:

    /home/HY/boost_1_59_0

The following directory should be added to linker library paths:

/home/HY/boost_1_59_0/stage/lib

This problem showed up during my first few attempts to build Caffe. After a long search, it turned out that the python module had not been built when boost was compiled. If you built libboost_python3 correctly as described above, this problem will not appear; if you do hit it, go into the boost folder and build the python module (boost can be rebuilt repeatedly), using the commands given earlier.

⑤ While building the boost library

...failed gcc.compile.c++ bin.v2/libs/python/build/gcc-4.8.2/release/link-static/threading-multi/numpy/scalars.o...
gcc.compile.c++ bin.v2/libs/python/build/gcc-4.8.2/release/link-static/threading-multi/numpy/ufunc.o
In file included from ./boost/python/detail/prefix.hpp:13:0,
                 from ./boost/python/args.hpp:8,
                 from ./boost/python.hpp:11,
                 from ./boost/python/numpy/internal.hpp:17,
                 from libs/python/src/numpy/ufunc.cpp:8:
./boost/python/detail/wrap_python.hpp:50:23: fatal error: pyconfig.h: No such file or directory
 # include <pyconfig.h>
                       ^
compilation terminated.

This problem appears in roughly the same place as the fourth one: both happen while building boost's python module. It occurs because pyconfig.h cannot be found during compilation.
Solution: when building boost_python, use

./b2 --with-python include="path/to/pyconfig.h"

The path inside the quotes can be determined by running:

locate pyconfig.h


⑥ During make runtest

error while loading shared libraries: libpython3.6m.so.1.0: cannot open shared object file: No such file or directory

As the message says, a shared library cannot be found, so first make sure that /etc/ld.so.conf contains the paths of your own shared libraries.
My shared library paths are shown below.
[screenshot of the author's /etc/ld.so.conf entries, omitted]
The error above means libpython3.6m.so.1.0 cannot be found.
Solution: use find / -name XXX.so to locate libpython3.6m.so.1.0, then either copy libpython3.6m.so.1.0 into /usr/local/lib, or add the directory containing libpython3.6m.so.1.0 to /etc/ld.so.conf, and then run ldconfig. A sketch follows.
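
A minimal sketch of the second approach, assuming the library lives under the Anaconda prefix used earlier (/root/anaconda3):

find / -name "libpython3.6m.so.1.0" 2>/dev/null    # locate the library
echo "/root/anaconda3/lib" >> /etc/ld.so.conf      # register its directory
ldconfig                                           # refresh the linker cache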

⑦ NVCC warnings

The cause seems to be that CUDA 8.0 and later deprecate compute capability 2.0 and 2.1, so the fix is simple:

vi Makefile.config

Under CUDA_ARCH, simply delete the lines -gencode arch=compute_20,code=sm_20 and -gencode arch=compute_20,code=sm_21.
