
[NEW] 2018/08/15

Running Object Recognition with the Caffe Deep Learning Framework on a Raspberry Pi

(Run the Caffe Deep Learning Framework on a Raspberry Pi and try object recognition)

Tags: [Raspberry Pi], [Electronics], [Deep Learning]






● Running the Caffe Deep Learning Framework on a Raspberry Pi for object recognition

 I learned that the Caffe Deep Learning Framework can run on a Raspberry Pi, so let's try it.
 Caffe needs to be built first; build it as described below.


● How to build the Caffe Deep Learning Framework on a Raspberry Pi

[NEW] 2018/08/04
[Build edition] Running DeepDream on a Raspberry Pi to mass-produce creepy images with the Caffe Deep Learning Framework

  Build the Caffe Deep Learning Framework on a Raspberry Pi and run Deep Dream to generate creepy images


● Raspberry Pi Raspbian OS version used this time

 RASPBIAN STRETCH WITH DESKTOP
 Version: June 2018
 Release date: 2018-06-27
 Kernel version: 4.14
pi@raspberrypi:~/pytorch $ uname -a
Linux raspberrypi 4.14.50-v7+ #1122 SMP Tue Jun 19 12:26:26 BST 2018 armv7l GNU/Linux


● Running the Caffe Deep Learning Framework on a Raspberry Pi for object recognition

Caffe - Brewing ImageNet

caffe/models/bvlc_reference_caffenet/
 BAIR/BVLC CaffeNet Model

# Set the CAFFE_ROOT environment variable
export CAFFE_ROOT=/home/pi/caffe

# Download the auxiliary data needed for recognition
cd $CAFFE_ROOT/data/ilsvrc12
sh get_ilsvrc_aux.sh
# http://dl.caffe.berkeleyvision.org/caffe_ilsvrc12.tar.gz

# Download the model data needed for recognition
# (without it, classify.py fails with)
# RuntimeError: Could not open file ../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel

# Download the model file bvlc_reference_caffenet.caffemodel
cd $CAFFE_ROOT/models/bvlc_reference_caffenet
wget http://dl.caffe.berkeleyvision.org/bvlc_reference_caffenet.caffemodel

 The sample program classify.py writes its results in binary format, so change the output to text format by rewriting
    np.save(args.output_file, predictions)
as
    np.savetxt(args.output_file, predictions)
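Why this one-word change matters: np.save writes NumPy's binary .npy format, while np.savetxt writes plain space-separated text that any script can parse. A minimal sketch with a dummy probability vector (the numbers here are made up; result.txt matches the file name used later on this page):

```python
import numpy as np

# Dummy probability vector in place of the real classify.py predictions
predictions = np.array([[0.75, 0.14, 0.06, 0.03]])

# np.savetxt writes plain text: one row per line, space-separated values
np.savetxt("result.txt", predictions)

# Any tool can now read the values back as text
with open("result.txt") as f:
    values = [float(v) for v in f.read().split()]
print(values)
```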


# As usual, patch it quickly from the command line with sed
cd $CAFFE_ROOT/python
sed -i -e "s/np.save(/np.savetxt(/g" classify.py

tail classify.py


 After the change:
pi@raspberrypi:~/caffe/python $ nano classify.py
    # Save
    print("Saving results into %s" % args.output_file)
    np.savetxt(args.output_file, predictions)

● I wrote a PHP app that prints the recognition results together with the recognized-object list.

result_disp.php
<?php
// Recognized-object list: one label per line, in the same order
// as the probabilities in result.txt
$file = fopen("../data/ilsvrc12/synset_words.txt", "r");

// result.txt holds the space-separated probabilities written by np.savetxt
$result = file_get_contents("result.txt");
$results = explode(" ", $result);
$cnt = count($results);
for ($i = 0; $i < $cnt; ++$i)
{
    $line = fgets($file);

    $str_val = $results[$i];
    $float_val = floatval($str_val);
    // Show only objects recognized with more than 1% probability
    if ($float_val > 0.01) {
        printf("%s - %5.2f\n", chop($line), $float_val);
    }
}
fclose($file);
?>
 result.txt is the list of recognition results and ../data/ilsvrc12/synset_words.txt is the list of recognizable objects; the entries correspond line by line in the same order.
 The PHP program above displays the objects recognized with a probability greater than 1%.
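The same pairing can also be done in a few lines of Python. This is only a sketch of the equivalent logic, assuming the file layout described above; the function name and the default 1% threshold are mine:

```python
def show_results(result_path="result.txt",
                 labels_path="../data/ilsvrc12/synset_words.txt",
                 threshold=0.01):
    # Probabilities written by np.savetxt: space-separated floats
    with open(result_path) as f:
        probs = [float(v) for v in f.read().split()]
    # One object label per line, same order as the probabilities
    with open(labels_path) as f:
        labels = [line.rstrip() for line in f]
    # Keep and print only objects above the threshold
    hits = [(label, p) for label, p in zip(labels, probs) if p > threshold]
    for label, p in hits:
        print("%s - %5.2f" % (label, p))
    return hits
```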

# Install PHP
sudo apt-get -y install php

 Run it in the ~/caffe/python directory.
python classify.py --raw_scale 255 {image to recognize} result.txt

F-22 (fighter jet)
cd $CAFFE_ROOT/python

# F-22 (fighter jet)
wget https://upload.wikimedia.org/wikipedia/commons/thumb/c/c3/F-22_Raptor_-_100702-F-4815G-217.jpg/320px-F-22_Raptor_-_100702-F-4815G-217.jpg -O f22.jpg

python classify.py --raw_scale 255 f22.jpg result.txt

php result_disp.php

n02690373 airliner -  0.06
n04266014 space shuttle -  0.03
n04552348 warplane, military plane -  0.75
n04592741 wing -  0.14
 The F-22 photo was recognized as a warplane (military plane) at 75%.

Boeing 737
# Boeing 737
wget https://upload.wikimedia.org/wikipedia/commons/thumb/a/a7/Air_Berlin_B737-700_Dreamliner_D-ABBN.jpg/320px-Air_Berlin_B737-700_Dreamliner_D-ABBN.jpg -O airplane.jpg

python classify.py --raw_scale 255 airplane.jpg result.txt

php result_disp.php

n02690373 airliner -  0.98
n04592741 wing -  0.02
 The Boeing 737 photo was recognized as an airliner at 98%.
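Since result.txt is just the 1000-class probability vector in synset_words.txt order, the top-N classes can be extracted directly. A sketch assuming those two files exist as described above (top_n is a name I made up):

```python
import numpy as np

def top_n(result_path="result.txt",
          labels_path="../data/ilsvrc12/synset_words.txt", n=5):
    # Load the probability vector written by np.savetxt
    probs = np.loadtxt(result_path)
    with open(labels_path) as f:
        labels = [line.rstrip() for line in f]
    # Indices of the n largest probabilities, highest first
    order = np.argsort(probs)[::-1][:n]
    return [(labels[i], float(probs[i])) for i in order]
```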

pi@raspberrypi:~/caffe/python $ python classify.py --raw_scale 255 airplane.jpg result.txt
CPU mode
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0816 13:52:51.040508 19416 net.cpp:51] Initializing net from parameters:
name: "CaffeNet"
state {
  phase: TEST
  level: 0
}
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 10
      dim: 3
      dim: 227
      dim: 227
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc8"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8"
  inner_product_param {
    num_output: 1000
  }
}
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8"
  top: "prob"
}
I0816 13:52:51.061775 19416 layer_factory.hpp:77] Creating layer data
I0816 13:52:51.062077 19416 net.cpp:84] Creating Layer data
I0816 13:52:51.062284 19416 net.cpp:380] data -> data
I0816 13:52:51.062521 19416 net.cpp:122] Setting up data
I0816 13:52:51.062809 19416 net.cpp:129] Top shape: 10 3 227 227 (1545870)
I0816 13:52:51.063079 19416 net.cpp:137] Memory required for data: 6183480
I0816 13:52:51.063313 19416 layer_factory.hpp:77] Creating layer conv1
I0816 13:52:51.063376 19416 net.cpp:84] Creating Layer conv1
I0816 13:52:51.063416 19416 net.cpp:406] conv1 <- data
I0816 13:52:51.063591 19416 net.cpp:380] conv1 -> conv1
I0816 13:52:51.063997 19416 net.cpp:122] Setting up conv1
I0816 13:52:51.064280 19416 net.cpp:129] Top shape: 10 96 55 55 (2904000)
I0816 13:52:51.064563 19416 net.cpp:137] Memory required for data: 17799480
I0816 13:52:51.064801 19416 layer_factory.hpp:77] Creating layer relu1
I0816 13:52:51.065027 19416 net.cpp:84] Creating Layer relu1
I0816 13:52:51.065248 19416 net.cpp:406] relu1 <- conv1
I0816 13:52:51.065300 19416 net.cpp:367] relu1 -> conv1 (in-place)
I0816 13:52:51.065448 19416 net.cpp:122] Setting up relu1
I0816 13:52:51.065678 19416 net.cpp:129] Top shape: 10 96 55 55 (2904000)
I0816 13:52:51.065731 19416 net.cpp:137] Memory required for data: 29415480
I0816 13:52:51.065802 19416 layer_factory.hpp:77] Creating layer pool1
I0816 13:52:51.065951 19416 net.cpp:84] Creating Layer pool1
I0816 13:52:51.065991 19416 net.cpp:406] pool1 <- conv1
I0816 13:52:51.066138 19416 net.cpp:380] pool1 -> pool1
I0816 13:52:51.066207 19416 net.cpp:122] Setting up pool1
I0816 13:52:51.066332 19416 net.cpp:129] Top shape: 10 96 27 27 (699840)
I0816 13:52:51.066462 19416 net.cpp:137] Memory required for data: 32214840
I0816 13:52:51.066581 19416 layer_factory.hpp:77] Creating layer norm1
I0816 13:52:51.066633 19416 net.cpp:84] Creating Layer norm1
I0816 13:52:51.066762 19416 net.cpp:406] norm1 <- pool1
I0816 13:52:51.066810 19416 net.cpp:380] norm1 -> norm1
I0816 13:52:51.067032 19416 net.cpp:122] Setting up norm1
I0816 13:52:51.067232 19416 net.cpp:129] Top shape: 10 96 27 27 (699840)
I0816 13:52:51.067314 19416 net.cpp:137] Memory required for data: 35014200
I0816 13:52:51.067353 19416 layer_factory.hpp:77] Creating layer conv2
I0816 13:52:51.067499 19416 net.cpp:84] Creating Layer conv2
I0816 13:52:51.067538 19416 net.cpp:406] conv2 <- norm1
I0816 13:52:51.067678 19416 net.cpp:380] conv2 -> conv2
I0816 13:52:51.069715 19416 net.cpp:122] Setting up conv2
I0816 13:52:51.070214 19416 net.cpp:129] Top shape: 10 256 27 27 (1866240)
I0816 13:52:51.070528 19416 net.cpp:137] Memory required for data: 42479160
I0816 13:52:51.070804 19416 layer_factory.hpp:77] Creating layer relu2
I0816 13:52:51.071072 19416 net.cpp:84] Creating Layer relu2
I0816 13:52:51.071305 19416 net.cpp:406] relu2 <- conv2
I0816 13:52:51.071513 19416 net.cpp:367] relu2 -> conv2 (in-place)
I0816 13:52:51.071791 19416 net.cpp:122] Setting up relu2
I0816 13:52:51.072022 19416 net.cpp:129] Top shape: 10 256 27 27 (1866240)
I0816 13:52:51.072266 19416 net.cpp:137] Memory required for data: 49944120
I0816 13:52:51.072497 19416 layer_factory.hpp:77] Creating layer pool2
I0816 13:52:51.072733 19416 net.cpp:84] Creating Layer pool2
I0816 13:52:51.072957 19416 net.cpp:406] pool2 <- conv2
I0816 13:52:51.073163 19416 net.cpp:380] pool2 -> pool2
I0816 13:52:51.073388 19416 net.cpp:122] Setting up pool2
I0816 13:52:51.073585 19416 net.cpp:129] Top shape: 10 256 13 13 (432640)
I0816 13:52:51.073815 19416 net.cpp:137] Memory required for data: 51674680
I0816 13:52:51.074142 19416 layer_factory.hpp:77] Creating layer norm2
I0816 13:52:51.074390 19416 net.cpp:84] Creating Layer norm2
I0816 13:52:51.074621 19416 net.cpp:406] norm2 <- pool2
I0816 13:52:51.074827 19416 net.cpp:380] norm2 -> norm2
I0816 13:52:51.075084 19416 net.cpp:122] Setting up norm2
I0816 13:52:51.075275 19416 net.cpp:129] Top shape: 10 256 13 13 (432640)
I0816 13:52:51.075481 19416 net.cpp:137] Memory required for data: 53405240
I0816 13:52:51.075676 19416 layer_factory.hpp:77] Creating layer conv3
I0816 13:52:51.075914 19416 net.cpp:84] Creating Layer conv3
I0816 13:52:51.076117 19416 net.cpp:406] conv3 <- norm2
I0816 13:52:51.076169 19416 net.cpp:380] conv3 -> conv3
I0816 13:52:51.081418 19416 net.cpp:122] Setting up conv3
I0816 13:52:51.081956 19416 net.cpp:129] Top shape: 10 384 13 13 (648960)
I0816 13:52:51.082263 19416 net.cpp:137] Memory required for data: 56001080
I0816 13:52:51.082545 19416 layer_factory.hpp:77] Creating layer relu3
I0816 13:52:51.082803 19416 net.cpp:84] Creating Layer relu3
I0816 13:52:51.083045 19416 net.cpp:406] relu3 <- conv3
I0816 13:52:51.083295 19416 net.cpp:367] relu3 -> conv3 (in-place)
I0816 13:52:51.083513 19416 net.cpp:122] Setting up relu3
I0816 13:52:51.083729 19416 net.cpp:129] Top shape: 10 384 13 13 (648960)
I0816 13:52:51.083945 19416 net.cpp:137] Memory required for data: 58596920
I0816 13:52:51.084148 19416 layer_factory.hpp:77] Creating layer conv4
I0816 13:52:51.084429 19416 net.cpp:84] Creating Layer conv4
I0816 13:52:51.084744 19416 net.cpp:406] conv4 <- conv3
I0816 13:52:51.084982 19416 net.cpp:380] conv4 -> conv4
I0816 13:52:51.089143 19416 net.cpp:122] Setting up conv4
I0816 13:52:51.089643 19416 net.cpp:129] Top shape: 10 384 13 13 (648960)
I0816 13:52:51.089953 19416 net.cpp:137] Memory required for data: 61192760
I0816 13:52:51.090211 19416 layer_factory.hpp:77] Creating layer relu4
I0816 13:52:51.090454 19416 net.cpp:84] Creating Layer relu4
I0816 13:52:51.090687 19416 net.cpp:406] relu4 <- conv4
I0816 13:52:51.090927 19416 net.cpp:367] relu4 -> conv4 (in-place)
I0816 13:52:51.091187 19416 net.cpp:122] Setting up relu4
I0816 13:52:51.091405 19416 net.cpp:129] Top shape: 10 384 13 13 (648960)
I0816 13:52:51.091612 19416 net.cpp:137] Memory required for data: 63788600
I0816 13:52:51.091809 19416 layer_factory.hpp:77] Creating layer conv5
I0816 13:52:51.092072 19416 net.cpp:84] Creating Layer conv5
I0816 13:52:51.092314 19416 net.cpp:406] conv5 <- conv4
I0816 13:52:51.092589 19416 net.cpp:380] conv5 -> conv5
I0816 13:52:51.095367 19416 net.cpp:122] Setting up conv5
I0816 13:52:51.095888 19416 net.cpp:129] Top shape: 10 256 13 13 (432640)
I0816 13:52:51.096218 19416 net.cpp:137] Memory required for data: 65519160
I0816 13:52:51.096469 19416 layer_factory.hpp:77] Creating layer relu5
I0816 13:52:51.096684 19416 net.cpp:84] Creating Layer relu5
I0816 13:52:51.096887 19416 net.cpp:406] relu5 <- conv5
I0816 13:52:51.097107 19416 net.cpp:367] relu5 -> conv5 (in-place)
I0816 13:52:51.097365 19416 net.cpp:122] Setting up relu5
I0816 13:52:51.097687 19416 net.cpp:129] Top shape: 10 256 13 13 (432640)
I0816 13:52:51.097892 19416 net.cpp:137] Memory required for data: 67249720
I0816 13:52:51.098141 19416 layer_factory.hpp:77] Creating layer pool5
I0816 13:52:51.098352 19416 net.cpp:84] Creating Layer pool5
I0816 13:52:51.098675 19416 net.cpp:406] pool5 <- conv5
I0816 13:52:51.098953 19416 net.cpp:380] pool5 -> pool5
I0816 13:52:51.099220 19416 net.cpp:122] Setting up pool5
I0816 13:52:51.099440 19416 net.cpp:129] Top shape: 10 256 6 6 (92160)
I0816 13:52:51.099681 19416 net.cpp:137] Memory required for data: 67618360
I0816 13:52:51.099925 19416 layer_factory.hpp:77] Creating layer fc6
I0816 13:52:51.100191 19416 net.cpp:84] Creating Layer fc6
I0816 13:52:51.100419 19416 net.cpp:406] fc6 <- pool5
I0816 13:52:51.100661 19416 net.cpp:380] fc6 -> fc6
I0816 13:52:51.316649 19416 net.cpp:122] Setting up fc6
I0816 13:52:51.317203 19416 net.cpp:129] Top shape: 10 4096 (40960)
I0816 13:52:51.317517 19416 net.cpp:137] Memory required for data: 67782200
I0816 13:52:51.317777 19416 layer_factory.hpp:77] Creating layer relu6
I0816 13:52:51.318035 19416 net.cpp:84] Creating Layer relu6
I0816 13:52:51.318311 19416 net.cpp:406] relu6 <- fc6
I0816 13:52:51.318555 19416 net.cpp:367] relu6 -> fc6 (in-place)
I0816 13:52:51.318776 19416 net.cpp:122] Setting up relu6
I0816 13:52:51.319022 19416 net.cpp:129] Top shape: 10 4096 (40960)
I0816 13:52:51.319268 19416 net.cpp:137] Memory required for data: 67946040
I0816 13:52:51.319495 19416 layer_factory.hpp:77] Creating layer drop6
I0816 13:52:51.319706 19416 net.cpp:84] Creating Layer drop6
I0816 13:52:51.319945 19416 net.cpp:406] drop6 <- fc6
I0816 13:52:51.320184 19416 net.cpp:367] drop6 -> fc6 (in-place)
I0816 13:52:51.320435 19416 net.cpp:122] Setting up drop6
I0816 13:52:51.320667 19416 net.cpp:129] Top shape: 10 4096 (40960)
I0816 13:52:51.320904 19416 net.cpp:137] Memory required for data: 68109880
I0816 13:52:51.321144 19416 layer_factory.hpp:77] Creating layer fc7
I0816 13:52:51.321352 19416 net.cpp:84] Creating Layer fc7
I0816 13:52:51.321542 19416 net.cpp:406] fc7 <- fc6
I0816 13:52:51.321748 19416 net.cpp:380] fc7 -> fc7
I0816 13:52:51.417714 19416 net.cpp:122] Setting up fc7
I0816 13:52:51.418320 19416 net.cpp:129] Top shape: 10 4096 (40960)
I0816 13:52:51.418643 19416 net.cpp:137] Memory required for data: 68273720
I0816 13:52:51.418906 19416 layer_factory.hpp:77] Creating layer relu7
I0816 13:52:51.419175 19416 net.cpp:84] Creating Layer relu7
I0816 13:52:51.419401 19416 net.cpp:406] relu7 <- fc7
I0816 13:52:51.419637 19416 net.cpp:367] relu7 -> fc7 (in-place)
I0816 13:52:51.419858 19416 net.cpp:122] Setting up relu7
I0816 13:52:51.420114 19416 net.cpp:129] Top shape: 10 4096 (40960)
I0816 13:52:51.420349 19416 net.cpp:137] Memory required for data: 68437560
I0816 13:52:51.420573 19416 layer_factory.hpp:77] Creating layer drop7
I0816 13:52:51.420816 19416 net.cpp:84] Creating Layer drop7
I0816 13:52:51.421072 19416 net.cpp:406] drop7 <- fc7
I0816 13:52:51.421314 19416 net.cpp:367] drop7 -> fc7 (in-place)
I0816 13:52:51.421576 19416 net.cpp:122] Setting up drop7
I0816 13:52:51.421797 19416 net.cpp:129] Top shape: 10 4096 (40960)
I0816 13:52:51.422005 19416 net.cpp:137] Memory required for data: 68601400
I0816 13:52:51.422209 19416 layer_factory.hpp:77] Creating layer fc8
I0816 13:52:51.422412 19416 net.cpp:84] Creating Layer fc8
I0816 13:52:51.422693 19416 net.cpp:406] fc8 <- fc7
I0816 13:52:51.422952 19416 net.cpp:380] fc8 -> fc8
I0816 13:52:51.446458 19416 net.cpp:122] Setting up fc8
I0816 13:52:51.447000 19416 net.cpp:129] Top shape: 10 1000 (10000)
I0816 13:52:51.447319 19416 net.cpp:137] Memory required for data: 68641400
I0816 13:52:51.447587 19416 layer_factory.hpp:77] Creating layer prob
I0816 13:52:51.447831 19416 net.cpp:84] Creating Layer prob
I0816 13:52:51.448105 19416 net.cpp:406] prob <- fc8
I0816 13:52:51.448367 19416 net.cpp:380] prob -> prob
I0816 13:52:51.448642 19416 net.cpp:122] Setting up prob
I0816 13:52:51.448843 19416 net.cpp:129] Top shape: 10 1000 (10000)
I0816 13:52:51.449128 19416 net.cpp:137] Memory required for data: 68681400
I0816 13:52:51.449360 19416 net.cpp:200] prob does not need backward computation.
I0816 13:52:51.449563 19416 net.cpp:200] fc8 does not need backward computation.
I0816 13:52:51.449812 19416 net.cpp:200] drop7 does not need backward computation.
I0816 13:52:51.450067 19416 net.cpp:200] relu7 does not need backward computation.
I0816 13:52:51.450289 19416 net.cpp:200] fc7 does not need backward computation.
I0816 13:52:51.450520 19416 net.cpp:200] drop6 does not need backward computation.
I0816 13:52:51.450747 19416 net.cpp:200] relu6 does not need backward computation.
I0816 13:52:51.450989 19416 net.cpp:200] fc6 does not need backward computation.
I0816 13:52:51.451185 19416 net.cpp:200] pool5 does not need backward computation.
I0816 13:52:51.451395 19416 net.cpp:200] relu5 does not need backward computation.
I0816 13:52:51.451592 19416 net.cpp:200] conv5 does not need backward computation.
I0816 13:52:51.451786 19416 net.cpp:200] relu4 does not need backward computation.
I0816 13:52:51.451992 19416 net.cpp:200] conv4 does not need backward computation.
I0816 13:52:51.452188 19416 net.cpp:200] relu3 does not need backward computation.
I0816 13:52:51.452415 19416 net.cpp:200] conv3 does not need backward computation.
I0816 13:52:51.452682 19416 net.cpp:200] norm2 does not need backward computation.
I0816 13:52:51.452994 19416 net.cpp:200] pool2 does not need backward computation.
I0816 13:52:51.453312 19416 net.cpp:200] relu2 does not need backward computation.
I0816 13:52:51.453542 19416 net.cpp:200] conv2 does not need backward computation.
I0816 13:52:51.453744 19416 net.cpp:200] norm1 does not need backward computation.
I0816 13:52:51.454011 19416 net.cpp:200] pool1 does not need backward computation.
I0816 13:52:51.454252 19416 net.cpp:200] relu1 does not need backward computation.
I0816 13:52:51.454478 19416 net.cpp:200] conv1 does not need backward computation.
I0816 13:52:51.454692 19416 net.cpp:200] data does not need backward computation.
I0816 13:52:51.454900 19416 net.cpp:242] This network produces output prob
I0816 13:52:51.455215 19416 net.cpp:255] Network initialization done.
I0816 13:53:13.421692 19416 upgrade_proto.cpp:46] Attempting to upgrade input file specified using deprecated transformation parameters: ../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I0816 13:53:13.421979 19416 upgrade_proto.cpp:49] Successfully upgraded file specified using deprecated data transformation parameters.
W0816 13:53:13.422032 19416 upgrade_proto.cpp:51] Note that future Caffe releases will only support transform_param messages for transformation fields.
I0816 13:53:13.422080 19416 upgrade_proto.cpp:55] Attempting to upgrade input file specified using deprecated V1LayerParameter: ../models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
I0816 13:53:28.815574 19416 upgrade_proto.cpp:63] Successfully upgraded file specified using deprecated V1LayerParameter
I0816 13:53:35.435127 19416 net.cpp:744] Ignoring source layer loss
Loading file: airplane.jpg
Classifying 1 inputs.
Done in 14.71 s.
Saving results into result.txt


● Sites that frequently come up in Deep Learning object recognition and object detection

 The CIFAR-10 dataset

cifar-100-python.tar.gz
https://web.archive.org/web/20160526193113/http://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz

 80 million tiny images
Visual Dictionary Teaching computers to recognize objects

 COMPUTATIONAL VISION AT CALTECH
Caltech 101 - Computational Vision

Caltech 256 - Computational Vision
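For reference, the python versions of the CIFAR archives linked above are pickled batch files. A minimal loader sketch (the key names follow the published CIFAR-10 python format; the paths are hypothetical):

```python
import pickle

def load_cifar_batch(path):
    # Each batch file is a pickled dict; in the CIFAR-10 python format,
    # b"data" is an N x 3072 uint8 array (3072 = 3x32x32, channel-planar RGB)
    # and b"labels" is a list of N class indices.
    with open(path, "rb") as f:
        batch = pickle.load(f, encoding="bytes")
    return batch[b"data"], batch[b"labels"]

# Hypothetical usage after extracting cifar-10-python.tar.gz:
# data, labels = load_cifar_batch("cifar-10-batches-py/data_batch_1")
```

Note that CIFAR-100 batches use b"fine_labels" and b"coarse_labels" instead of b"labels".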



Tags: [Raspberry Pi], [Electronics], [Deep Learning]

● Related content (people who read this article also read the following)

How to stop unnecessary services on a Raspberry Pi to increase free memory for memory-hungry apps

  Stop unneeded services on the Pi to free up main memory for memory-heavy builds and apps

[Success edition] Applying the NNPACK processing from digitalbrain79's Darknet with NNPACK to the latest Darknet

  Run the latest NNPACK-enabled Darknet on the Pi for very fast object detection and DeepDream nightmares

[Success edition] How to build the NNPACK-enabled Darknet Neural Network Framework on a Raspberry Pi

  Build Darknet NNPACK (darknet-nnpack) from source on the Pi and perform object detection

[Success edition] How to build the Darknet Neural Network Framework on a Raspberry Pi

  Install the Darknet Neural Network Framework on the Pi to run object detection and generate nightmare images

[Success edition] How to install the TensorFlow Deep Learning Framework on a Raspberry Pi

  Install the TensorFlow Deep Learning Framework on the Pi and see nightmares with Google DeepDream

How to self-build the TensorFlow Deep Learning Framework on a Raspberry Pi

  Self-build the TensorFlow Deep Learning Framework on the Pi

[Build edition] Running DeepDream on a Raspberry Pi to mass-produce creepy images with the Caffe Deep Learning Framework

  Build the Caffe Deep Learning Framework on the Pi and run Deep Dream to generate creepy images

[Install edition] Running DeepDream on a Raspberry Pi to mass-produce creepy images with Caffe Deep Learning

  Install the Caffe Deep Learning Framework on the Pi and run Deep Dream to generate creepy images

How to build the Caffe2 Deep Learning Framework from source on a Raspberry Pi

  Self-build the Caffe2 Deep Learning Framework from source on the Pi

Testing DeepDream with the 64-bit power of the Orange Pi PC 2 to mass-produce creepy images at high speed

  Build the Caffe Deep Learning Framework on the OrangePi PC2 and run Deep Dream to generate creepy images

Installing Jupyter Notebook on a Raspberry Pi to run IPython .ipynb notebooks

  Install IPython Notebook on the Pi and run the Google DeepDream dream.ipynb notebook

Installing the Chainer Deep Learning framework on a Raspberry Pi

  Install the Chainer Deep Learning framework on the Pi

How to build DeepBeliefSDK on a Raspberry Pi and run the image recognition framework

  Install DeepBeliefSDK on the Pi and perform object recognition on images

How to build Microsoft's ELL on a Raspberry Pi

  Test-build Microsoft's ELL Embedded Learning Library on the Pi (build only)

How to run the MXNet port of SSD Single Shot MultiBox on a Raspberry Pi for object detection in images

  Try object detection on the Pi with the MXNet port of SSD Single Shot MultiBox Object Detector

How to build Apache MXNet Incubating on a Raspberry Pi

  Test-build Apache MXNet Incubating on the Pi (build only)

Real-time face detection on camera video with OpenCV Haar Cascade Object Detection on a Raspberry Pi

  Real-time face detection on camera video on the Pi with OpenCV Haar Cascade Object Detection (Face & Eye)

How to build NNPACK on a Raspberry Pi

  Test-build NNPACK on the Pi (build only)

Summary of commands used on the Raspberry Pi 3 Linux console

  Handy commands for the Raspbian OS command line, including load testing and checking the CPU serial number



Copyright (c) 2018 FREE WING, Y.Sakamoto

http://www.neko.ne.jp/~freewing/raspberry_pi/raspberry_pi_caffe_deep_learning_recognize_object/