First, get the graphics card working (driver + CUDA). Already installed? Great. Not installed? Then just use the CPU; the steps are exactly the same (switch solver_mode from GPU to CPU in the solver prototxt).
Copy the cuDNN v5.0 header, library, and DLL files into the CUDA 8 installation directories.
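A minimal sketch of that copy step, assuming cuDNN v5.0 was unzipped to D:\cudnn-8.0-v5.0 and CUDA 8 sits in its default location (both paths are assumptions, adjust them to your machine):
copy D:\cudnn-8.0-v5.0\cuda\include\cudnn.h "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v8.0\include"
copy D:\cudnn-8.0-v5.0\cuda\lib\x64\cudnn.lib "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v8.0\lib\x64"
copy D:\cudnn-8.0-v5.0\cuda\bin\cudnn64_5.dll "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v8.0\bin"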
-----------------------------Preparation--------------------------------------
git clone https://github.com/BVLC/caffe.git
git branch -a
git checkout windows
cmake-gui
configure + vs2013 Win64
In cmake-gui, change the BLAS option from atlas to Open (OpenBLAS)
Set BUILD_matlab to ON
Check that numpy is installed and configured correctly (a quick check is included in the sketch after the generate step)
configure
generate
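If you prefer the command line over cmake-gui, the equivalent configure + generate can be done roughly like this (a sketch, assuming an out-of-source build directory under the repo root; the option names match the checkboxes above):
REM verify that numpy is visible to the Python that CMake will pick up
python -c "import numpy; print(numpy.__version__)"
REM configure and generate the VS2013 x64 solution
mkdir build && cd build
cmake -G "Visual Studio 12 2013 Win64" -DBLAS=Open -DBUILD_matlab=ON -DCMAKE_BUILD_TYPE=Release ..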
----------------------------build---------------------------------------
choose the Release configuration
build ALL_BUILD (build the whole solution)
reload the solution when prompted, then build all again
build the INSTALL project
add the install bin directory to PATH (see the example below)
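For the current cmd session this is a single set command (a sketch, assuming the default install prefix under the build tree; use the system environment-variable dialog to make it permanent):
set PATH=D:\ws_caffe\caffe\build\install\bin;%PATH%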
-----------------------------wget----------------------------------------------
wget --no-check-certificate http://www.cs.toronto.edu/~kriz/cifar-10-binary.tar.gz
./get_cifar10.sh (run only the extraction part of the script, skip the rest), i.e.:
tar -xvf cifar-10-binary.tar.gz && rm -f cifar-10-binary.tar.gz
mv cifar-10-batches-bin/* . && rm -rf cifar-10-batches-bin
(the extracted files were placed under the examples directory)
==================================== Summary ===========================================
convert_cifar_data.exe data/cifar10 examples/cifar10 lmdb
compute_image_mean.exe -backend=lmdb examples/cifar10/cifar10_train_lmdb examples/cifar10/mean.binaryproto
caffe train --solver=examples/cifar10/cifar10_quick_solver.prototxt
caffe train --solver=examples/cifar10/cifar10_quick_solver_lr1.prototxt --snapshot=examples/cifar10/cifar10_quick_iter_4000.solverstate
caffe test -model examples/cifar10/cifar10_quick_train_test.prototxt -weights examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 -iterations 100
---------------------------------------
The contents of synset_words.txt are as follows:
- airplane
- automobile
- bird
- cat
- deer
- dog
- frog
- horse
- ship
- truck
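These ten labels are what the C++ classification example reads from its labels file. Once training is done, the model can be tried on a single image roughly like this (a sketch; the deploy net examples/cifar10/cifar10_quick.prototxt and the test image are assumptions, substitute your own paths):
classification.exe examples/cifar10/cifar10_quick.prototxt examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 examples/cifar10/mean.binaryproto examples/cifar10/synset_words.txt examples/images/cat.jpg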
======================================= Details ===============================================
convert_cifar_data.exe data/cifar10 examples/cifar10 lmdb
compute_image_mean.exe -backend=lmdb examples/cifar10/cifar10_train_lmdb examples/cifar10/mean.binaryproto
caffe.exe train --solver=examples/cifar10/cifar10_quick_solver.prototxt
I0703 15:08:42.303948 73968 net.cpp:137] Memory required for data: 1230000
I0703 15:08:42.303948 73968 layer_factory.cpp:58] Creating layer conv1
I0703 15:08:42.303948 73968 net.cpp:84] Creating Layer conv1
I0703 15:08:42.303948 73968 net.cpp:406] conv1 <- data
I0703 15:08:42.303948 73968 net.cpp:380] conv1 -> conv1
I0703 15:08:42.303948 73968 net.cpp:122] Setting up conv1
I0703 15:08:42.303948 73968 net.cpp:129] Top shape: 100 32 32 32 (3276800)
I0703 15:08:42.319548 73968 net.cpp:137] Memory required for data: 14337200
I0703 15:08:42.319548 73968 layer_factory.cpp:58] Creating layer pool1
I0703 15:08:42.319548 73968 net.cpp:84] Creating Layer pool1
I0703 15:08:42.319548 73968 net.cpp:406] pool1 <- conv1
I0703 15:08:42.319548 73968 net.cpp:380] pool1 -> pool1
I0703 15:08:42.319548 73968 net.cpp:122] Setting up pool1
I0703 15:08:42.319548 73968 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:08:42.319548 73968 net.cpp:137] Memory required for data: 17614000
I0703 15:08:42.319548 73968 layer_factory.cpp:58] Creating layer relu1
I0703 15:08:42.319548 73968 net.cpp:84] Creating Layer relu1
I0703 15:08:42.319548 73968 net.cpp:406] relu1 <- pool1
I0703 15:08:42.319548 73968 net.cpp:367] relu1 -> pool1 (in-place)
I0703 15:08:42.319548 73968 net.cpp:122] Setting up relu1
I0703 15:08:42.319548 73968 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:08:42.319548 73968 net.cpp:137] Memory required for data: 20890800
I0703 15:08:42.319548 73968 layer_factory.cpp:58] Creating layer conv2
I0703 15:08:42.319548 73968 net.cpp:84] Creating Layer conv2
I0703 15:08:42.319548 73968 net.cpp:406] conv2 <- pool1
I0703 15:08:42.319548 73968 net.cpp:380] conv2 -> conv2
I0703 15:08:42.319548 73968 net.cpp:122] Setting up conv2
I0703 15:08:42.319548 73968 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:08:42.319548 73968 net.cpp:137] Memory required for data: 24167600
I0703 15:08:42.319548 73968 layer_factory.cpp:58] Creating layer relu2
I0703 15:08:42.319548 73968 net.cpp:84] Creating Layer relu2
I0703 15:08:42.319548 73968 net.cpp:406] relu2 <- conv2
I0703 15:08:42.319548 73968 net.cpp:367] relu2 -> conv2 (in-place)
I0703 15:08:42.335150 73968 net.cpp:122] Setting up relu2
I0703 15:08:42.335150 73968 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:08:42.335150 73968 net.cpp:137] Memory required for data: 27444400
I0703 15:08:42.335150 73968 layer_factory.cpp:58] Creating layer pool2
I0703 15:08:42.335150 73968 net.cpp:84] Creating Layer pool2
I0703 15:08:42.335150 73968 net.cpp:406] pool2 <- conv2
I0703 15:08:42.335150 73968 net.cpp:380] pool2 -> pool2
I0703 15:08:42.335150 73968 net.cpp:122] Setting up pool2
I0703 15:08:42.335150 73968 net.cpp:129] Top shape: 100 32 8 8 (204800)
I0703 15:08:42.335150 73968 net.cpp:137] Memory required for data: 28263600
I0703 15:08:42.335150 73968 layer_factory.cpp:58] Creating layer conv3
I0703 15:08:42.335150 73968 net.cpp:84] Creating Layer conv3
I0703 15:08:42.335150 73968 net.cpp:406] conv3 <- pool2
I0703 15:08:42.335150 73968 net.cpp:380] conv3 -> conv3
I0703 15:08:42.335150 73968 net.cpp:122] Setting up conv3
I0703 15:08:42.335150 73968 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0703 15:08:42.335150 73968 net.cpp:137] Memory required for data: 29902000
I0703 15:08:42.335150 73968 layer_factory.cpp:58] Creating layer relu3
I0703 15:08:42.335150 73968 net.cpp:84] Creating Layer relu3
I0703 15:08:42.335150 73968 net.cpp:406] relu3 <- conv3
I0703 15:08:42.335150 73968 net.cpp:367] relu3 -> conv3 (in-place)
I0703 15:08:42.335150 73968 net.cpp:122] Setting up relu3
I0703 15:08:42.335150 73968 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0703 15:08:42.335150 73968 net.cpp:137] Memory required for data: 31540400
I0703 15:08:42.335150 73968 layer_factory.cpp:58] Creating layer pool3
I0703 15:08:42.350749 73968 net.cpp:84] Creating Layer pool3
I0703 15:08:42.350749 73968 net.cpp:406] pool3 <- conv3
I0703 15:08:42.350749 73968 net.cpp:380] pool3 -> pool3
I0703 15:08:42.350749 73968 net.cpp:122] Setting up pool3
I0703 15:08:42.350749 73968 net.cpp:129] Top shape: 100 64 4 4 (102400)
I0703 15:08:42.350749 73968 net.cpp:137] Memory required for data: 31950000
I0703 15:08:42.350749 73968 layer_factory.cpp:58] Creating layer ip1
I0703 15:08:42.350749 73968 net.cpp:84] Creating Layer ip1
I0703 15:08:42.350749 73968 net.cpp:406] ip1 <- pool3
I0703 15:08:42.350749 73968 net.cpp:380] ip1 -> ip1
I0703 15:08:42.350749 73968 net.cpp:122] Setting up ip1
I0703 15:08:42.350749 73968 net.cpp:129] Top shape: 100 64 (6400)
I0703 15:08:42.350749 73968 net.cpp:137] Memory required for data: 31975600
I0703 15:08:42.350749 73968 layer_factory.cpp:58] Creating layer ip2
I0703 15:08:42.350749 73968 net.cpp:84] Creating Layer ip2
I0703 15:08:42.350749 73968 net.cpp:406] ip2 <- ip1
I0703 15:08:42.350749 73968 net.cpp:380] ip2 -> ip2
I0703 15:08:42.350749 73968 net.cpp:122] Setting up ip2
I0703 15:08:42.350749 73968 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:08:42.350749 73968 net.cpp:137] Memory required for data: 31979600
I0703 15:08:42.350749 73968 layer_factory.cpp:58] Creating layer ip2_ip2_0_split
I0703 15:08:42.350749 73968 net.cpp:84] Creating Layer ip2_ip2_0_split
I0703 15:08:42.350749 73968 net.cpp:406] ip2_ip2_0_split <- ip2
I0703 15:08:42.350749 73968 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_0
I0703 15:08:42.350749 73968 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_1
I0703 15:08:42.350749 73968 net.cpp:122] Setting up ip2_ip2_0_split
I0703 15:08:42.350749 73968 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:08:42.350749 73968 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:08:42.350749 73968 net.cpp:137] Memory required for data: 31987600
I0703 15:08:42.350749 73968 layer_factory.cpp:58] Creating layer accuracy
I0703 15:08:42.350749 73968 net.cpp:84] Creating Layer accuracy
I0703 15:08:42.350749 73968 net.cpp:406] accuracy <- ip2_ip2_0_split_0
I0703 15:08:42.350749 73968 net.cpp:406] accuracy <- label_cifar_1_split_0
I0703 15:08:42.366350 73968 net.cpp:380] accuracy -> accuracy
I0703 15:08:42.366350 73968 net.cpp:122] Setting up accuracy
I0703 15:08:42.366350 73968 net.cpp:129] Top shape: (1)
I0703 15:08:42.366350 73968 net.cpp:137] Memory required for data: 31987604
I0703 15:08:42.366350 73968 layer_factory.cpp:58] Creating layer loss
I0703 15:08:42.366350 73968 net.cpp:84] Creating Layer loss
I0703 15:08:42.366350 73968 net.cpp:406] loss <- ip2_ip2_0_split_1
I0703 15:08:42.366350 73968 net.cpp:406] loss <- label_cifar_1_split_1
I0703 15:08:42.366350 73968 net.cpp:380] loss -> loss
I0703 15:08:42.366350 73968 layer_factory.cpp:58] Creating layer loss
I0703 15:08:42.366350 73968 net.cpp:122] Setting up loss
I0703 15:08:42.366350 73968 net.cpp:129] Top shape: (1)
I0703 15:08:42.366350 73968 net.cpp:132] with loss weight 1
I0703 15:08:42.366350 73968 net.cpp:137] Memory required for data: 31987608
I0703 15:08:42.366350 73968 net.cpp:198] loss needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:200] accuracy does not need backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] ip2_ip2_0_split needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] ip2 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] ip1 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] pool3 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] relu3 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] conv3 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] pool2 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] relu2 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] conv2 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] relu1 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] pool1 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] conv1 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:200] label_cifar_1_split does not need backward computation.
I0703 15:08:42.366350 73968 net.cpp:200] cifar does not need backward computation.
I0703 15:08:42.366350 73968 net.cpp:242] This network produces output accuracy
I0703 15:08:42.366350 73968 net.cpp:242] This network produces output loss
I0703 15:08:42.366350 73968 net.cpp:255] Network initialization done.
I0703 15:08:42.366350 73968 solver.cpp:56] Solver scaffolding done.
I0703 15:08:42.366350 73968 caffe.cpp:249] Starting Optimization
I0703 15:08:42.366350 73968 solver.cpp:272] Solving CIFAR10_quick
I0703 15:08:42.366350 73968 solver.cpp:273] Learning Rate Policy: fixed
I0703 15:08:42.413151 73968 solver.cpp:330] Iteration 0, Testing net (#0)
I0703 15:08:43.629990 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:08:43.661191 73968 solver.cpp:397] Test net output #0: accuracy = 0.0986
I0703 15:08:43.661191 73968 solver.cpp:397] Test net output #1: loss = 2.30244 (* 1 = 2.30244 loss)
I0703 15:08:43.707993 73968 solver.cpp:218] Iteration 0 (-5.42863e-042 iter/s, 1.29273s/100 iters), loss = 2.30274
I0703 15:08:43.707993 73968 solver.cpp:237] Train net output #0: loss = 2.30274 (* 1 = 2.30274 loss)
I0703 15:08:43.707993 73968 sgd_solver.cpp:105] Iteration 0, lr = 0.001
I0703 15:08:46.640887 73968 solver.cpp:218] Iteration 100 (34.1913 iter/s, 2.92472s/100 iters), loss = 1.65303
I0703 15:08:46.640887 73968 solver.cpp:237] Train net output #0: loss = 1.65303 (* 1 = 1.65303 loss)
I0703 15:08:46.640887 73968 sgd_solver.cpp:105] Iteration 100, lr = 0.001
I0703 15:08:49.563181 73968 solver.cpp:218] Iteration 200 (34.1799 iter/s, 2.9257s/100 iters), loss = 1.60865
I0703 15:08:49.563181 73968 solver.cpp:237] Train net output #0: loss = 1.60865 (* 1 = 1.60865 loss)
I0703 15:08:49.563181 73968 sgd_solver.cpp:105] Iteration 200, lr = 0.001
I0703 15:08:52.480474 73968 solver.cpp:218] Iteration 300 (34.2469 iter/s, 2.91998s/100 iters), loss = 1.185
I0703 15:08:52.480474 73968 solver.cpp:237] Train net output #0: loss = 1.185 (* 1 = 1.185 loss)
I0703 15:08:52.480474 73968 sgd_solver.cpp:105] Iteration 300, lr = 0.001
I0703 15:08:55.428969 73968 solver.cpp:218] Iteration 400 (34.0078 iter/s, 2.9405s/100 iters), loss = 1.22841
I0703 15:08:55.428969 73968 solver.cpp:237] Train net output #0: loss = 1.22841 (* 1 = 1.22841 loss)
I0703 15:08:55.428969 73968 sgd_solver.cpp:105] Iteration 400, lr = 0.001
I0703 15:08:58.253659 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:08:58.347262 73968 solver.cpp:330] Iteration 500, Testing net (#0)
I0703 15:08:59.486099 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:08:59.517300 73968 solver.cpp:397] Test net output #0: accuracy = 0.5535
I0703 15:08:59.517300 73968 solver.cpp:397] Test net output #1: loss = 1.2742 (* 1 = 1.2742 loss)
I0703 15:08:59.548501 73968 solver.cpp:218] Iteration 500 (24.261 iter/s, 4.12185s/100 iters), loss = 1.23252
I0703 15:08:59.548501 73968 solver.cpp:237] Train net output #0: loss = 1.23252 (* 1 = 1.23252 loss)
I0703 15:08:59.548501 73968 sgd_solver.cpp:105] Iteration 500, lr = 0.001
I0703 15:09:02.512596 73968 solver.cpp:218] Iteration 600 (33.7982 iter/s, 2.95873s/100 iters), loss = 1.25093
I0703 15:09:02.512596 73968 solver.cpp:237] Train net output #0: loss = 1.25093 (* 1 = 1.25093 loss)
I0703 15:09:02.512596 73968 sgd_solver.cpp:105] Iteration 600, lr = 0.001
I0703 15:09:05.493698 73968 solver.cpp:218] Iteration 700 (33.3689 iter/s, 2.99681s/100 iters), loss = 1.15823
I0703 15:09:05.494699 73968 solver.cpp:237] Train net output #0: loss = 1.15823 (* 1 = 1.15823 loss)
I0703 15:09:05.494699 73968 sgd_solver.cpp:105] Iteration 700, lr = 0.001
I0703 15:09:08.483223 73968 solver.cpp:218] Iteration 800 (33.6528 iter/s, 2.97152s/100 iters), loss = 1.02646
I0703 15:09:08.483223 73968 solver.cpp:237] Train net output #0: loss = 1.02646 (* 1 = 1.02646 loss)
I0703 15:09:08.483223 73968 sgd_solver.cpp:105] Iteration 800, lr = 0.001
I0703 15:09:11.447319 73968 solver.cpp:218] Iteration 900 (33.8917 iter/s, 2.95058s/100 iters), loss = 1.09516
I0703 15:09:11.447319 73968 solver.cpp:237] Train net output #0: loss = 1.09516 (* 1 = 1.09516 loss)
I0703 15:09:11.447319 73968 sgd_solver.cpp:105] Iteration 900, lr = 0.001
I0703 15:09:14.285416 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:14.394620 73968 solver.cpp:330] Iteration 1000, Testing net (#0)
I0703 15:09:15.551057 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:15.597858 73968 solver.cpp:397] Test net output #0: accuracy = 0.6325
I0703 15:09:15.597858 73968 solver.cpp:397] Test net output #1: loss = 1.05842 (* 1 = 1.05842 loss)
I0703 15:09:15.629060 73968 solver.cpp:218] Iteration 1000 (23.9025 iter/s, 4.18367s/100 iters), loss = 0.983432
I0703 15:09:15.629060 73968 solver.cpp:237] Train net output #0: loss = 0.983432 (* 1 = 0.983432 loss)
I0703 15:09:15.629060 73968 sgd_solver.cpp:105] Iteration 1000, lr = 0.001
I0703 15:09:18.599957 73968 solver.cpp:218] Iteration 1100 (33.8006 iter/s, 2.95853s/100 iters), loss = 1.06141
I0703 15:09:18.599957 73968 solver.cpp:237] Train net output #0: loss = 1.06141 (* 1 = 1.06141 loss)
I0703 15:09:18.599957 73968 sgd_solver.cpp:105] Iteration 1100, lr = 0.001
I0703 15:09:19.317580 73968 blocking_queue.cpp:49] Waiting for data
I0703 15:09:21.548452 73968 solver.cpp:218] Iteration 1200 (33.9381 iter/s, 2.94654s/100 iters), loss = 0.950789
I0703 15:09:21.548452 73968 solver.cpp:237] Train net output #0: loss = 0.950789 (* 1 = 0.950789 loss)
I0703 15:09:21.548452 73968 sgd_solver.cpp:105] Iteration 1200, lr = 0.001
I0703 15:09:24.512547 73968 solver.cpp:218] Iteration 1300 (33.7456 iter/s, 2.96335s/100 iters), loss = 0.845029
I0703 15:09:24.512547 73968 solver.cpp:237] Train net output #0: loss = 0.845029 (* 1 = 0.845029 loss)
I0703 15:09:24.512547 73968 sgd_solver.cpp:105] Iteration 1300, lr = 0.001
I0703 15:09:27.538540 73968 solver.cpp:218] Iteration 1400 (33.0713 iter/s, 3.02377s/100 iters), loss = 0.854708
I0703 15:09:27.538540 73968 solver.cpp:237] Train net output #0: loss = 0.854708 (* 1 = 0.854708 loss)
I0703 15:09:27.538540 73968 sgd_solver.cpp:105] Iteration 1400, lr = 0.001
I0703 15:09:30.405486 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:30.499089 73968 solver.cpp:330] Iteration 1500, Testing net (#0)
I0703 15:09:31.642729 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:31.689529 73968 solver.cpp:397] Test net output #0: accuracy = 0.666
I0703 15:09:31.689529 73968 solver.cpp:397] Test net output #1: loss = 0.965108 (* 1 = 0.965108 loss)
I0703 15:09:31.706130 73968 solver.cpp:218] Iteration 1500 (23.9048 iter/s, 4.18326s/100 iters), loss = 0.803068
I0703 15:09:31.706130 73968 solver.cpp:237] Train net output #0: loss = 0.803068 (* 1 = 0.803068 loss)
I0703 15:09:31.706130 73968 sgd_solver.cpp:105] Iteration 1500, lr = 0.001
I0703 15:09:34.671032 73968 solver.cpp:218] Iteration 1600 (33.9581 iter/s, 2.94481s/100 iters), loss = 0.89545
I0703 15:09:34.671032 73968 solver.cpp:237] Train net output #0: loss = 0.89545 (* 1 = 0.89545 loss)
I0703 15:09:34.671032 73968 sgd_solver.cpp:105] Iteration 1600, lr = 0.001
I0703 15:09:37.666788 73968 solver.cpp:218] Iteration 1700 (33.1725 iter/s, 3.01455s/100 iters), loss = 0.858076
I0703 15:09:37.667788 73968 solver.cpp:237] Train net output #0: loss = 0.858076 (* 1 = 0.858076 loss)
I0703 15:09:37.667788 73968 sgd_solver.cpp:105] Iteration 1700, lr = 0.001
I0703 15:09:40.681776 73968 solver.cpp:218] Iteration 1800 (33.4777 iter/s, 2.98707s/100 iters), loss = 0.739417
I0703 15:09:40.681776 73968 solver.cpp:237] Train net output #0: loss = 0.739417 (* 1 = 0.739417 loss)
I0703 15:09:40.681776 73968 sgd_solver.cpp:105] Iteration 1800, lr = 0.001
I0703 15:09:43.649132 73968 solver.cpp:218] Iteration 1900 (33.6842 iter/s, 2.96875s/100 iters), loss = 0.755557
I0703 15:09:43.649132 73968 solver.cpp:237] Train net output #0: loss = 0.755557 (* 1 = 0.755557 loss)
I0703 15:09:43.649132 73968 sgd_solver.cpp:105] Iteration 1900, lr = 0.001
I0703 15:09:46.534909 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:46.628511 73968 solver.cpp:330] Iteration 2000, Testing net (#0)
I0703 15:09:47.772374 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:47.819176 73968 solver.cpp:397] Test net output #0: accuracy = 0.7016
I0703 15:09:47.819176 73968 solver.cpp:397] Test net output #1: loss = 0.878213 (* 1 = 0.878213 loss)
I0703 15:09:47.850378 73968 solver.cpp:218] Iteration 2000 (23.8369 iter/s, 4.19518s/100 iters), loss = 0.678401
I0703 15:09:47.850378 73968 solver.cpp:237] Train net output #0: loss = 0.678401 (* 1 = 0.678401 loss)
I0703 15:09:47.850378 73968 sgd_solver.cpp:105] Iteration 2000, lr = 0.001
I0703 15:09:50.791185 73968 solver.cpp:218] Iteration 2100 (33.7445 iter/s, 2.96345s/100 iters), loss = 0.790343
I0703 15:09:50.792186 73968 solver.cpp:237] Train net output #0: loss = 0.790343 (* 1 = 0.790343 loss)
I0703 15:09:50.792186 73968 sgd_solver.cpp:105] Iteration 2100, lr = 0.001
I0703 15:09:53.723479 73968 solver.cpp:218] Iteration 2200 (34.139 iter/s, 2.9292s/100 iters), loss = 0.792828
I0703 15:09:53.723479 73968 solver.cpp:237] Train net output #0: loss = 0.792828 (* 1 = 0.792828 loss)
I0703 15:09:53.723479 73968 sgd_solver.cpp:105] Iteration 2200, lr = 0.001
I0703 15:09:56.723778 73968 solver.cpp:218] Iteration 2300 (33.3467 iter/s, 2.9988s/100 iters), loss = 0.634268
I0703 15:09:56.724778 73968 solver.cpp:237] Train net output #0: loss = 0.634268 (* 1 = 0.634268 loss)
I0703 15:09:56.724778 73968 sgd_solver.cpp:105] Iteration 2300, lr = 0.001
I0703 15:09:59.751081 73968 solver.cpp:218] Iteration 2400 (33.0722 iter/s, 3.02369s/100 iters), loss = 0.730874
I0703 15:09:59.752081 73968 solver.cpp:237] Train net output #0: loss = 0.730874 (* 1 = 0.730874 loss)
I0703 15:09:59.752081 73968 sgd_solver.cpp:105] Iteration 2400, lr = 0.001
I0703 15:10:02.642370 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:02.746381 73968 solver.cpp:330] Iteration 2500, Testing net (#0)
I0703 15:10:03.921499 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:03.966502 73968 solver.cpp:397] Test net output #0: accuracy = 0.7126
I0703 15:10:03.966502 73968 solver.cpp:397] Test net output #1: loss = 0.845339 (* 1 = 0.845339 loss)
I0703 15:10:03.996505 73968 solver.cpp:218] Iteration 2500 (23.57 iter/s, 4.24269s/100 iters), loss = 0.613872
I0703 15:10:03.997506 73968 solver.cpp:237] Train net output #0: loss = 0.613872 (* 1 = 0.613872 loss)
I0703 15:10:03.997506 73968 sgd_solver.cpp:105] Iteration 2500, lr = 0.001
I0703 15:10:06.947801 73968 solver.cpp:218] Iteration 2600 (33.9124 iter/s, 2.94878s/100 iters), loss = 0.713529
I0703 15:10:06.948801 73968 solver.cpp:237] Train net output #0: loss = 0.713529 (* 1 = 0.713529 loss)
I0703 15:10:06.949801 73968 sgd_solver.cpp:105] Iteration 2600, lr = 0.001
I0703 15:10:09.897095 73968 solver.cpp:218] Iteration 2700 (33.94 iter/s, 2.94638s/100 iters), loss = 0.749147
I0703 15:10:09.898097 73968 solver.cpp:237] Train net output #0: loss = 0.749147 (* 1 = 0.749147 loss)
I0703 15:10:09.898097 73968 sgd_solver.cpp:105] Iteration 2700, lr = 0.001
I0703 15:10:12.863761 73968 solver.cpp:218] Iteration 2800 (33.7369 iter/s, 2.96411s/100 iters), loss = 0.570446
I0703 15:10:12.864761 73968 solver.cpp:237] Train net output #0: loss = 0.570446 (* 1 = 0.570446 loss)
I0703 15:10:12.865762 73968 sgd_solver.cpp:105] Iteration 2800, lr = 0.001
I0703 15:10:15.809679 73968 solver.cpp:218] Iteration 2900 (33.9821 iter/s, 2.94273s/100 iters), loss = 0.713307
I0703 15:10:15.810678 73968 solver.cpp:237] Train net output #0: loss = 0.713307 (* 1 = 0.713307 loss)
I0703 15:10:15.811678 73968 sgd_solver.cpp:105] Iteration 2900, lr = 0.001
I0703 15:10:18.606696 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:18.704706 73968 solver.cpp:330] Iteration 3000, Testing net (#0)
I0703 15:10:19.830818 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:19.873823 73968 solver.cpp:397] Test net output #0: accuracy = 0.7197
I0703 15:10:19.874824 73968 solver.cpp:397] Test net output #1: loss = 0.835526 (* 1 = 0.835526 loss)
I0703 15:10:19.903826 73968 solver.cpp:218] Iteration 3000 (24.4437 iter/s, 4.09103s/100 iters), loss = 0.57974
I0703 15:10:19.904826 73968 solver.cpp:237] Train net output #0: loss = 0.57974 (* 1 = 0.57974 loss)
I0703 15:10:19.905827 73968 sgd_solver.cpp:105] Iteration 3000, lr = 0.001
I0703 15:10:22.878123 73968 solver.cpp:218] Iteration 3100 (33.657 iter/s, 2.97115s/100 iters), loss = 0.664127
I0703 15:10:22.879123 73968 solver.cpp:237] Train net output #0: loss = 0.664127 (* 1 = 0.664127 loss)
I0703 15:10:22.879123 73968 sgd_solver.cpp:105] Iteration 3100, lr = 0.001
I0703 15:10:25.912427 73968 solver.cpp:218] Iteration 3200 (32.9824 iter/s, 3.03192s/100 iters), loss = 0.726144
I0703 15:10:25.913427 73968 solver.cpp:237] Train net output #0: loss = 0.726144 (* 1 = 0.726144 loss)
I0703 15:10:25.914427 73968 sgd_solver.cpp:105] Iteration 3200, lr = 0.001
I0703 15:10:28.902726 73968 solver.cpp:218] Iteration 3300 (33.4837 iter/s, 2.98653s/100 iters), loss = 0.597564
I0703 15:10:28.902726 73968 solver.cpp:237] Train net output #0: loss = 0.597564 (* 1 = 0.597564 loss)
I0703 15:10:28.903726 73968 sgd_solver.cpp:105] Iteration 3300, lr = 0.001
I0703 15:10:31.947031 73968 solver.cpp:218] Iteration 3400 (32.872 iter/s, 3.0421s/100 iters), loss = 0.663627
I0703 15:10:31.948030 73968 solver.cpp:237] Train net output #0: loss = 0.663627 (* 1 = 0.663627 loss)
I0703 15:10:31.948030 73968 sgd_solver.cpp:105] Iteration 3400, lr = 0.001
I0703 15:10:34.875890 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:34.976495 73968 solver.cpp:330] Iteration 3500, Testing net (#0)
I0703 15:10:36.121947 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:36.185348 73968 solver.cpp:397] Test net output #0: accuracy = 0.7187
I0703 15:10:36.185348 73968 solver.cpp:397] Test net output #1: loss = 0.862067 (* 1 = 0.862067 loss)
I0703 15:10:36.216549 73968 solver.cpp:218] Iteration 3500 (23.5687 iter/s, 4.24291s/100 iters), loss = 0.560303
I0703 15:10:36.216549 73968 solver.cpp:237] Train net output #0: loss = 0.560303 (* 1 = 0.560303 loss)
I0703 15:10:36.216549 73968 sgd_solver.cpp:105] Iteration 3500, lr = 0.001
I0703 15:10:39.186511 73968 solver.cpp:218] Iteration 3600 (33.6513 iter/s, 2.97165s/100 iters), loss = 0.619981
I0703 15:10:39.186511 73968 solver.cpp:237] Train net output #0: loss = 0.619981 (* 1 = 0.619981 loss)
I0703 15:10:39.186511 73968 sgd_solver.cpp:105] Iteration 3600, lr = 0.001
I0703 15:10:42.190245 73968 solver.cpp:218] Iteration 3700 (33.4028 iter/s, 2.99376s/100 iters), loss = 0.798935
I0703 15:10:42.190245 73968 solver.cpp:237] Train net output #0: loss = 0.798935 (* 1 = 0.798935 loss)
I0703 15:10:42.190245 73968 sgd_solver.cpp:105] Iteration 3700, lr = 0.001
I0703 15:10:45.185573 73968 solver.cpp:218] Iteration 3800 (33.3392 iter/s, 2.99947s/100 iters), loss = 0.563457
I0703 15:10:45.185573 73968 solver.cpp:237] Train net output #0: loss = 0.563457 (* 1 = 0.563457 loss)
I0703 15:10:45.185573 73968 sgd_solver.cpp:105] Iteration 3800, lr = 0.001
I0703 15:10:48.180902 73968 solver.cpp:218] Iteration 3900 (33.4396 iter/s, 2.99047s/100 iters), loss = 0.629793
I0703 15:10:48.180902 73968 solver.cpp:237] Train net output #0: loss = 0.629793 (* 1 = 0.629793 loss)
I0703 15:10:48.180902 73968 sgd_solver.cpp:105] Iteration 3900, lr = 0.001
I0703 15:10:51.034423 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:51.128026 73968 solver.cpp:447] Snapshotting to binary proto file examples/cifar10/cifar10_quick_iter_4000.caffemodel
I0703 15:10:51.159227 73968 sgd_solver.cpp:273] Snapshotting solver state to binary proto file examples/cifar10/cifar10_quick_iter_4000.solverstate
I0703 15:10:51.174827 73968 solver.cpp:310] Iteration 4000, loss = 0.518084
I0703 15:10:51.174827 73968 solver.cpp:330] Iteration 4000, Testing net (#0)
I0703 15:10:52.338073 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:52.378077 73968 solver.cpp:397] Test net output #0: accuracy = 0.7254
I0703 15:10:52.379077 73968 solver.cpp:397] Test net output #1: loss = 0.841199 (* 1 = 0.841199 loss)
I0703 15:10:52.380077 73968 solver.cpp:315] Optimization Done.
I0703 15:10:52.380077 73968 caffe.cpp:260] Optimization Done.
D:\ws_caffe\caffe>
-----------------------------------------------------------------------------------------------------------
caffe train --solver=examples/cifar10/cifar10_quick_solver_lr1.prototxt --snapshot=examples/cifar10/cifar10_quick_iter_4000.solverstate
caffe test -model examples/cifar10/cifar10_quick_train_test.prototxt -weights examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 -iterations 100
I0703 15:36:58.857549 67740 net.cpp:367] relu2 -> conv2 (in-place)
I0703 15:36:58.857549 67740 net.cpp:122] Setting up relu2
I0703 15:36:58.857549 67740 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:36:58.857549 67740 net.cpp:137] Memory required for data: 27444400
I0703 15:36:58.857549 67740 layer_factory.cpp:58] Creating layer pool2
I0703 15:36:58.857549 67740 net.cpp:84] Creating Layer pool2
I0703 15:36:58.857549 67740 net.cpp:406] pool2 <- conv2
I0703 15:36:58.857549 67740 net.cpp:380] pool2 -> pool2
I0703 15:36:58.857549 67740 net.cpp:122] Setting up pool2
I0703 15:36:58.857549 67740 net.cpp:129] Top shape: 100 32 8 8 (204800)
I0703 15:36:58.857549 67740 net.cpp:137] Memory required for data: 28263600
I0703 15:36:58.857549 67740 layer_factory.cpp:58] Creating layer conv3
I0703 15:36:58.857549 67740 net.cpp:84] Creating Layer conv3
I0703 15:36:58.857549 67740 net.cpp:406] conv3 <- pool2
I0703 15:36:58.857549 67740 net.cpp:380] conv3 -> conv3
I0703 15:36:58.873150 67740 net.cpp:122] Setting up conv3
I0703 15:36:58.873150 67740 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0703 15:36:58.873150 67740 net.cpp:137] Memory required for data: 29902000
I0703 15:36:58.873150 67740 layer_factory.cpp:58] Creating layer relu3
I0703 15:36:58.873150 67740 net.cpp:84] Creating Layer relu3
I0703 15:36:58.873150 67740 net.cpp:406] relu3 <- conv3
I0703 15:36:58.873150 67740 net.cpp:367] relu3 -> conv3 (in-place)
I0703 15:36:58.873150 67740 net.cpp:122] Setting up relu3
I0703 15:36:58.873150 67740 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0703 15:36:58.873150 67740 net.cpp:137] Memory required for data: 31540400
I0703 15:36:58.873150 67740 layer_factory.cpp:58] Creating layer pool3
I0703 15:36:58.873150 67740 net.cpp:84] Creating Layer pool3
I0703 15:36:58.873150 67740 net.cpp:406] pool3 <- conv3
I0703 15:36:58.873150 67740 net.cpp:380] pool3 -> pool3
I0703 15:36:58.873150 67740 net.cpp:122] Setting up pool3
I0703 15:36:58.873150 67740 net.cpp:129] Top shape: 100 64 4 4 (102400)
I0703 15:36:58.873150 67740 net.cpp:137] Memory required for data: 31950000
I0703 15:36:58.873150 67740 layer_factory.cpp:58] Creating layer ip1
I0703 15:36:58.873150 67740 net.cpp:84] Creating Layer ip1
I0703 15:36:58.873150 67740 net.cpp:406] ip1 <- pool3
I0703 15:36:58.873150 67740 net.cpp:380] ip1 -> ip1
I0703 15:36:58.873150 67740 net.cpp:122] Setting up ip1
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: 100 64 (6400)
I0703 15:36:58.888752 67740 net.cpp:137] Memory required for data: 31975600
I0703 15:36:58.888752 67740 layer_factory.cpp:58] Creating layer ip2
I0703 15:36:58.888752 67740 net.cpp:84] Creating Layer ip2
I0703 15:36:58.888752 67740 net.cpp:406] ip2 <- ip1
I0703 15:36:58.888752 67740 net.cpp:380] ip2 -> ip2
I0703 15:36:58.888752 67740 net.cpp:122] Setting up ip2
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:36:58.888752 67740 net.cpp:137] Memory required for data: 31979600
I0703 15:36:58.888752 67740 layer_factory.cpp:58] Creating layer ip2_ip2_0_split
I0703 15:36:58.888752 67740 net.cpp:84] Creating Layer ip2_ip2_0_split
I0703 15:36:58.888752 67740 net.cpp:406] ip2_ip2_0_split <- ip2
I0703 15:36:58.888752 67740 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_0
I0703 15:36:58.888752 67740 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_1
I0703 15:36:58.888752 67740 net.cpp:122] Setting up ip2_ip2_0_split
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:36:58.888752 67740 net.cpp:137] Memory required for data: 31987600
I0703 15:36:58.888752 67740 layer_factory.cpp:58] Creating layer accuracy
I0703 15:36:58.888752 67740 net.cpp:84] Creating Layer accuracy
I0703 15:36:58.888752 67740 net.cpp:406] accuracy <- ip2_ip2_0_split_0
I0703 15:36:58.888752 67740 net.cpp:406] accuracy <- label_cifar_1_split_0
I0703 15:36:58.888752 67740 net.cpp:380] accuracy -> accuracy
I0703 15:36:58.888752 67740 net.cpp:122] Setting up accuracy
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: (1)
I0703 15:36:58.888752 67740 net.cpp:137] Memory required for data: 31987604
I0703 15:36:58.888752 67740 layer_factory.cpp:58] Creating layer loss
I0703 15:36:58.888752 67740 net.cpp:84] Creating Layer loss
I0703 15:36:58.888752 67740 net.cpp:406] loss <- ip2_ip2_0_split_1
I0703 15:36:58.888752 67740 net.cpp:406] loss <- label_cifar_1_split_1
I0703 15:36:58.888752 67740 net.cpp:380] loss -> loss
I0703 15:36:58.888752 67740 layer_factory.cpp:58] Creating layer loss
I0703 15:36:58.888752 67740 net.cpp:122] Setting up loss
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: (1)
I0703 15:36:58.888752 67740 net.cpp:132] with loss weight 1
I0703 15:36:58.888752 67740 net.cpp:137] Memory required for data: 31987608
I0703 15:36:58.888752 67740 net.cpp:198] loss needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:200] accuracy does not need backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] ip2_ip2_0_split needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] ip2 needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] ip1 needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] pool3 needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] relu3 needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] conv3 needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] pool2 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:198] relu2 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:198] conv2 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:198] relu1 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:198] pool1 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:198] conv1 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:200] label_cifar_1_split does not need backward computation.
I0703 15:36:58.904353 67740 net.cpp:200] cifar does not need backward computation.
I0703 15:36:58.904353 67740 net.cpp:242] This network produces output accuracy
I0703 15:36:58.904353 67740 net.cpp:242] This network produces output loss
I0703 15:36:58.904353 67740 net.cpp:255] Network initialization done.
I0703 15:36:58.951158 67740 hdf5.cpp:32] Datatype class: H5T_FLOAT
I0703 15:36:58.951158 67740 caffe.cpp:291] Running for 100 iterations.
I0703 15:36:59.341192 67740 caffe.cpp:314] Batch 0, accuracy = 0.79
I0703 15:36:59.341192 67740 caffe.cpp:314] Batch 0, loss = 0.650074
I0703 15:36:59.731227 67740 caffe.cpp:314] Batch 1, accuracy = 0.74
I0703 15:36:59.731227 67740 caffe.cpp:314] Batch 1, loss = 0.732871
I0703 15:37:00.105661 67740 caffe.cpp:314] Batch 2, accuracy = 0.75
I0703 15:37:00.105661 67740 caffe.cpp:314] Batch 2, loss = 0.75108
I0703 15:37:00.589304 67740 caffe.cpp:314] Batch 3, accuracy = 0.74
I0703 15:37:00.589304 67740 caffe.cpp:314] Batch 3, loss = 0.827869
I0703 15:37:00.994941 67740 caffe.cpp:314] Batch 4, accuracy = 0.71
I0703 15:37:00.994941 67740 caffe.cpp:314] Batch 4, loss = 0.783882
I0703 15:37:01.306969 67740 caffe.cpp:314] Batch 5, accuracy = 0.8
I0703 15:37:01.306969 67740 caffe.cpp:314] Batch 5, loss = 0.478434
I0703 15:37:01.618998 67740 caffe.cpp:314] Batch 6, accuracy = 0.78
I0703 15:37:01.618998 67740 caffe.cpp:314] Batch 6, loss = 0.629121
I0703 15:37:02.305459 67740 caffe.cpp:314] Batch 7, accuracy = 0.73
I0703 15:37:02.305459 67740 caffe.cpp:314] Batch 7, loss = 0.900917
I0703 15:37:02.617486 67740 caffe.cpp:314] Batch 8, accuracy = 0.75
I0703 15:37:02.617486 67740 caffe.cpp:314] Batch 8, loss = 0.756295
I0703 15:37:02.992920 67740 caffe.cpp:314] Batch 9, accuracy = 0.79
I0703 15:37:02.992920 67740 caffe.cpp:314] Batch 9, loss = 0.72793
I0703 15:37:03.304949 67740 caffe.cpp:314] Batch 10, accuracy = 0.83
I0703 15:37:03.304949 67740 caffe.cpp:314] Batch 10, loss = 0.678172
I0703 15:37:04.005412 67740 caffe.cpp:314] Batch 11, accuracy = 0.75
I0703 15:37:04.005412 67740 caffe.cpp:314] Batch 11, loss = 0.737166
I0703 15:37:04.333041 67740 caffe.cpp:314] Batch 12, accuracy = 0.76
I0703 15:37:04.333041 67740 caffe.cpp:314] Batch 12, loss = 0.590975
I0703 15:37:04.661671 67740 caffe.cpp:314] Batch 13, accuracy = 0.77
I0703 15:37:04.661671 67740 caffe.cpp:314] Batch 13, loss = 0.638049
I0703 15:37:05.136315 67740 caffe.cpp:314] Batch 14, accuracy = 0.77
I0703 15:37:05.136315 67740 caffe.cpp:314] Batch 14, loss = 0.64608
I0703 15:37:05.741775 67740 caffe.cpp:314] Batch 15, accuracy = 0.75
I0703 15:37:05.741775 67740 caffe.cpp:314] Batch 15, loss = 0.745028
I0703 15:37:06.101608 67740 caffe.cpp:314] Batch 16, accuracy = 0.78
I0703 15:37:06.101608 67740 caffe.cpp:314] Batch 16, loss = 0.783791
I0703 15:37:06.430238 67740 caffe.cpp:314] Batch 17, accuracy = 0.77
I0703 15:37:06.430238 67740 caffe.cpp:314] Batch 17, loss = 0.694237
I0703 15:37:06.835873 67740 caffe.cpp:314] Batch 18, accuracy = 0.76
I0703 15:37:06.835873 67740 caffe.cpp:314] Batch 18, loss = 0.8077
I0703 15:37:07.335119 67740 caffe.cpp:314] Batch 19, accuracy = 0.74
I0703 15:37:07.335119 67740 caffe.cpp:314] Batch 19, loss = 0.816296
I0703 15:37:07.647146 67740 caffe.cpp:314] Batch 20, accuracy = 0.76
I0703 15:37:07.647146 67740 caffe.cpp:314] Batch 20, loss = 0.778361
I0703 15:37:07.959174 67740 caffe.cpp:314] Batch 21, accuracy = 0.74
I0703 15:37:07.959174 67740 caffe.cpp:314] Batch 21, loss = 0.733748
I0703 15:37:08.396013 67740 caffe.cpp:314] Batch 22, accuracy = 0.78
I0703 15:37:08.396013 67740 caffe.cpp:314] Batch 22, loss = 0.767683
I0703 15:37:08.942062 67740 caffe.cpp:314] Batch 23, accuracy = 0.72
I0703 15:37:08.942062 67740 caffe.cpp:314] Batch 23, loss = 0.860217
I0703 15:37:09.254091 67740 caffe.cpp:314] Batch 24, accuracy = 0.75
I0703 15:37:09.254091 67740 caffe.cpp:314] Batch 24, loss = 0.902482
I0703 15:37:09.566118 67740 caffe.cpp:314] Batch 25, accuracy = 0.68
I0703 15:37:09.566118 67740 caffe.cpp:314] Batch 25, loss = 1.06045
I0703 15:37:10.065363 67740 caffe.cpp:314] Batch 26, accuracy = 0.81
I0703 15:37:10.065363 67740 caffe.cpp:314] Batch 26, loss = 0.599207
I0703 15:37:10.580209 67740 caffe.cpp:314] Batch 27, accuracy = 0.76
I0703 15:37:10.580209 67740 caffe.cpp:314] Batch 27, loss = 0.750922
I0703 15:37:10.939041 67740 caffe.cpp:314] Batch 28, accuracy = 0.78
I0703 15:37:10.939041 67740 caffe.cpp:314] Batch 28, loss = 0.669347
I0703 15:37:11.251070 67740 caffe.cpp:314] Batch 29, accuracy = 0.8
I0703 15:37:11.251070 67740 caffe.cpp:314] Batch 29, loss = 0.742157
I0703 15:37:11.563097 67740 caffe.cpp:314] Batch 30, accuracy = 0.74
I0703 15:37:11.563097 67740 caffe.cpp:314] Batch 30, loss = 0.699957
I0703 15:37:12.202755 67740 caffe.cpp:314] Batch 31, accuracy = 0.74
I0703 15:37:12.202755 67740 caffe.cpp:314] Batch 31, loss = 0.771317
I0703 15:37:12.514783 67740 caffe.cpp:314] Batch 32, accuracy = 0.77
I0703 15:37:12.514783 67740 caffe.cpp:314] Batch 32, loss = 0.717153
I0703 15:37:12.858013 67740 caffe.cpp:314] Batch 33, accuracy = 0.78
I0703 15:37:12.858013 67740 caffe.cpp:314] Batch 33, loss = 0.695417
I0703 15:37:13.185643 67740 caffe.cpp:314] Batch 34, accuracy = 0.66
I0703 15:37:13.185643 67740 caffe.cpp:314] Batch 34, loss = 1.03245
I0703 15:37:13.794097 67740 caffe.cpp:314] Batch 35, accuracy = 0.78
I0703 15:37:13.794097 67740 caffe.cpp:314] Batch 35, loss = 0.822289
I0703 15:37:14.184132 67740 caffe.cpp:314] Batch 36, accuracy = 0.76
I0703 15:37:14.184132 67740 caffe.cpp:314] Batch 36, loss = 0.775723
I0703 15:37:14.480559 67740 caffe.cpp:314] Batch 37, accuracy = 0.75
I0703 15:37:14.480559 67740 caffe.cpp:314] Batch 37, loss = 0.880586
I0703 15:37:14.776986 67740 caffe.cpp:314] Batch 38, accuracy = 0.74
I0703 15:37:14.776986 67740 caffe.cpp:314] Batch 38, loss = 0.686667
I0703 15:37:15.463448 67740 caffe.cpp:314] Batch 39, accuracy = 0.73
I0703 15:37:15.463448 67740 caffe.cpp:314] Batch 39, loss = 0.649401
I0703 15:37:15.750674 67740 caffe.cpp:314] Batch 40, accuracy = 0.78
I0703 15:37:15.750674 67740 caffe.cpp:314] Batch 40, loss = 0.715338
I0703 15:37:16.156311 67740 caffe.cpp:314] Batch 41, accuracy = 0.79
I0703 15:37:16.156311 67740 caffe.cpp:314] Batch 41, loss = 0.789637
I0703 15:37:16.483939 67740 caffe.cpp:314] Batch 42, accuracy = 0.83
I0703 15:37:16.483939 67740 caffe.cpp:314] Batch 42, loss = 0.512648
I0703 15:37:17.171201 67740 caffe.cpp:314] Batch 43, accuracy = 0.79
I0703 15:37:17.171201 67740 caffe.cpp:314] Batch 43, loss = 0.754063
I0703 15:37:17.527634 67740 caffe.cpp:314] Batch 44, accuracy = 0.84
I0703 15:37:17.527634 67740 caffe.cpp:314] Batch 44, loss = 0.630253
I0703 15:37:17.839663 67740 caffe.cpp:314] Batch 45, accuracy = 0.79
I0703 15:37:17.839663 67740 caffe.cpp:314] Batch 45, loss = 0.799423
I0703 15:37:18.277501 67740 caffe.cpp:314] Batch 46, accuracy = 0.78
I0703 15:37:18.277501 67740 caffe.cpp:314] Batch 46, loss = 0.653652
I0703 15:37:18.745543 67740 caffe.cpp:314] Batch 47, accuracy = 0.77
I0703 15:37:18.745543 67740 caffe.cpp:314] Batch 47, loss = 0.707297
I0703 15:37:19.073173 67740 caffe.cpp:314] Batch 48, accuracy = 0.81
I0703 15:37:19.073173 67740 caffe.cpp:314] Batch 48, loss = 0.565867
I0703 15:37:19.385200 67740 caffe.cpp:314] Batch 49, accuracy = 0.77
I0703 15:37:19.385200 67740 caffe.cpp:314] Batch 49, loss = 0.670153
I0703 15:37:19.697228 67740 caffe.cpp:314] Batch 50, accuracy = 0.77
I0703 15:37:19.697228 67740 caffe.cpp:314] Batch 50, loss = 0.594869
I0703 15:37:20.446095 67740 caffe.cpp:314] Batch 51, accuracy = 0.76
I0703 15:37:20.446095 67740 caffe.cpp:314] Batch 51, loss = 0.700686
I0703 15:37:20.758123 67740 caffe.cpp:314] Batch 52, accuracy = 0.75
I0703 15:37:20.758123 67740 caffe.cpp:314] Batch 52, loss = 0.621944
I0703 15:37:21.101354 67740 caffe.cpp:314] Batch 53, accuracy = 0.72
I0703 15:37:21.101354 67740 caffe.cpp:314] Batch 53, loss = 0.814192
I0703 15:37:21.444586 67740 caffe.cpp:314] Batch 54, accuracy = 0.77
I0703 15:37:21.444586 67740 caffe.cpp:314] Batch 54, loss = 0.692094
I0703 15:37:22.084242 67740 caffe.cpp:314] Batch 55, accuracy = 0.71
I0703 15:37:22.084242 67740 caffe.cpp:314] Batch 55, loss = 0.841092
I0703 15:37:22.396270 67740 caffe.cpp:314] Batch 56, accuracy = 0.76
I0703 15:37:22.396270 67740 caffe.cpp:314] Batch 56, loss = 0.880208
I0703 15:37:22.708298 67740 caffe.cpp:314] Batch 57, accuracy = 0.82
I0703 15:37:22.708298 67740 caffe.cpp:314] Batch 57, loss = 0.565382
I0703 15:37:23.051529 67740 caffe.cpp:314] Batch 58, accuracy = 0.72
I0703 15:37:23.051529 67740 caffe.cpp:314] Batch 58, loss = 0.812194
I0703 15:37:23.691187 67740 caffe.cpp:314] Batch 59, accuracy = 0.76
I0703 15:37:23.691187 67740 caffe.cpp:314] Batch 59, loss = 0.812453
I0703 15:37:24.034417 67740 caffe.cpp:314] Batch 60, accuracy = 0.84
I0703 15:37:24.034417 67740 caffe.cpp:314] Batch 60, loss = 0.606914
I0703 15:37:24.362047 67740 caffe.cpp:314] Batch 61, accuracy = 0.73
I0703 15:37:24.362047 67740 caffe.cpp:314] Batch 61, loss = 0.735381
I0703 15:37:24.705278 67740 caffe.cpp:314] Batch 62, accuracy = 0.77
I0703 15:37:24.705278 67740 caffe.cpp:314] Batch 62, loss = 0.642518
I0703 15:37:25.391739 67740 caffe.cpp:314] Batch 63, accuracy = 0.81
I0703 15:37:25.391739 67740 caffe.cpp:314] Batch 63, loss = 0.617093
I0703 15:37:25.719369 67740 caffe.cpp:314] Batch 64, accuracy = 0.74
I0703 15:37:25.719369 67740 caffe.cpp:314] Batch 64, loss = 0.747036
I0703 15:37:26.109405 67740 caffe.cpp:314] Batch 65, accuracy = 0.76
I0703 15:37:26.109405 67740 caffe.cpp:314] Batch 65, loss = 0.837057
I0703 15:37:26.639852 67740 caffe.cpp:314] Batch 66, accuracy = 0.76
I0703 15:37:26.639852 67740 caffe.cpp:314] Batch 66, loss = 0.689606
I0703 15:37:27.139096 67740 caffe.cpp:314] Batch 67, accuracy = 0.77
I0703 15:37:27.139096 67740 caffe.cpp:314] Batch 67, loss = 0.770931
I0703 15:37:27.451124 67740 caffe.cpp:314] Batch 68, accuracy = 0.75
I0703 15:37:27.451124 67740 caffe.cpp:314] Batch 68, loss = 0.715115
I0703 15:37:27.778753 67740 caffe.cpp:314] Batch 69, accuracy = 0.7
I0703 15:37:27.778753 67740 caffe.cpp:314] Batch 69, loss = 0.947363
I0703 15:37:28.215593 67740 caffe.cpp:314] Batch 70, accuracy = 0.79
I0703 15:37:28.215593 67740 caffe.cpp:314] Batch 70, loss = 0.720477
I0703 15:37:28.699236 67740 caffe.cpp:314] Batch 71, accuracy = 0.81
I0703 15:37:28.699236 67740 caffe.cpp:314] Batch 71, loss = 0.644322
I0703 15:37:29.058068 67740 caffe.cpp:314] Batch 72, accuracy = 0.78
I0703 15:37:29.058068 67740 caffe.cpp:314] Batch 72, loss = 0.576543
I0703 15:37:29.401299 67740 caffe.cpp:314] Batch 73, accuracy = 0.82
I0703 15:37:29.416900 67740 caffe.cpp:314] Batch 73, loss = 0.483146
I0703 15:37:30.243775 67740 caffe.cpp:314] Batch 74, accuracy = 0.77
I0703 15:37:30.243775 67740 caffe.cpp:314] Batch 74, loss = 0.795883
I0703 15:37:30.571404 67740 caffe.cpp:314] Batch 75, accuracy = 0.78
I0703 15:37:30.571404 67740 caffe.cpp:314] Batch 75, loss = 0.626209
I0703 15:37:30.899034 67740 caffe.cpp:314] Batch 76, accuracy = 0.74
I0703 15:37:30.899034 67740 caffe.cpp:314] Batch 76, loss = 0.791658
I0703 15:37:31.242264 67740 caffe.cpp:314] Batch 77, accuracy = 0.78
I0703 15:37:31.242264 67740 caffe.cpp:314] Batch 77, loss = 0.690147
I0703 15:37:31.850719 67740 caffe.cpp:314] Batch 78, accuracy = 0.79
I0703 15:37:31.850719 67740 caffe.cpp:314] Batch 78, loss = 0.64814
I0703 15:37:32.178349 67740 caffe.cpp:314] Batch 79, accuracy = 0.74
I0703 15:37:32.178349 67740 caffe.cpp:314] Batch 79, loss = 0.790071
I0703 15:37:32.505978 67740 caffe.cpp:314] Batch 80, accuracy = 0.73
I0703 15:37:32.505978 67740 caffe.cpp:314] Batch 80, loss = 0.66705
I0703 15:37:32.833607 67740 caffe.cpp:314] Batch 81, accuracy = 0.78
I0703 15:37:32.833607 67740 caffe.cpp:314] Batch 81, loss = 0.601753
I0703 15:37:33.535670 67740 caffe.cpp:314] Batch 82, accuracy = 0.79
I0703 15:37:33.535670 67740 caffe.cpp:314] Batch 82, loss = 0.580124
I0703 15:37:33.847698 67740 caffe.cpp:314] Batch 83, accuracy = 0.74
I0703 15:37:33.847698 67740 caffe.cpp:314] Batch 83, loss = 0.791201
I0703 15:37:34.206531 67740 caffe.cpp:314] Batch 84, accuracy = 0.73
I0703 15:37:34.206531 67740 caffe.cpp:314] Batch 84, loss = 0.912428
I0703 15:37:34.721376 67740 caffe.cpp:314] Batch 85, accuracy = 0.82
I0703 15:37:34.721376 67740 caffe.cpp:314] Batch 85, loss = 0.584871
I0703 15:37:35.251824 67740 caffe.cpp:314] Batch 86, accuracy = 0.77
I0703 15:37:35.251824 67740 caffe.cpp:314] Batch 86, loss = 0.65482
I0703 15:37:35.565852 67740 caffe.cpp:314] Batch 87, accuracy = 0.8
I0703 15:37:35.565852 67740 caffe.cpp:314] Batch 87, loss = 0.684847
I0703 15:37:35.893482 67740 caffe.cpp:314] Batch 88, accuracy = 0.74
I0703 15:37:35.893482 67740 caffe.cpp:314] Batch 88, loss = 0.70748
I0703 15:37:36.706955 67740 caffe.cpp:314] Batch 89, accuracy = 0.78
I0703 15:37:36.706955 67740 caffe.cpp:314] Batch 89, loss = 0.703976
I0703 15:37:37.050186 67740 caffe.cpp:314] Batch 90, accuracy = 0.72
I0703 15:37:37.050186 67740 caffe.cpp:314] Batch 90, loss = 0.78584
I0703 15:37:37.362215 67740 caffe.cpp:314] Batch 91, accuracy = 0.82
I0703 15:37:37.362215 67740 caffe.cpp:314] Batch 91, loss = 0.571601
I0703 15:37:37.736649 67740 caffe.cpp:314] Batch 92, accuracy = 0.71
I0703 15:37:37.736649 67740 caffe.cpp:314] Batch 92, loss = 0.875672
I0703 15:37:38.532320 67740 caffe.cpp:314] Batch 93, accuracy = 0.77
I0703 15:37:38.532320 67740 caffe.cpp:314] Batch 93, loss = 0.780201
I0703 15:37:38.922354 67740 caffe.cpp:314] Batch 94, accuracy = 0.74
I0703 15:37:38.922354 67740 caffe.cpp:314] Batch 94, loss = 0.690553
I0703 15:37:39.546411 67740 caffe.cpp:314] Batch 95, accuracy = 0.82
I0703 15:37:39.546411 67740 caffe.cpp:314] Batch 95, loss = 0.664873
I0703 15:37:39.562011 69448 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:37:40.061256 67740 caffe.cpp:314] Batch 96, accuracy = 0.75
I0703 15:37:40.061256 67740 caffe.cpp:314] Batch 96, loss = 0.770992
I0703 15:37:40.404487 67740 caffe.cpp:314] Batch 97, accuracy = 0.71
I0703 15:37:40.404487 67740 caffe.cpp:314] Batch 97, loss = 0.847697
I0703 15:37:40.747719 67740 caffe.cpp:314] Batch 98, accuracy = 0.75
I0703 15:37:40.747719 67740 caffe.cpp:314] Batch 98, loss = 0.72742
I0703 15:37:41.090950 67740 caffe.cpp:314] Batch 99, accuracy = 0.75
I0703 15:37:41.090950 67740 caffe.cpp:314] Batch 99, loss = 0.656324
I0703 15:37:41.090950 67740 caffe.cpp:319] Loss: 0.724403
I0703 15:37:41.090950 67740 caffe.cpp:331] accuracy = 0.7643
I0703 15:37:41.090950 67740 caffe.cpp:331] loss = 0.724403 (* 1 = 0.724403 loss)
D:\ws_caffe\caffe>