Install pyfann (the Python bindings for the FANN neural network library) and run some tests to verify the installation

alinafe.matenda % brew install cmake /usr/local/lib/python2.7/site-packages/python-fann/FANN-2.2.0-Source
==> Downloading https://homebrew.bintray.com/bottles/cmake-3.5.2.el_capitan.bottle.tar.gz
######################################################################## 100.0%
==> Pouring cmake-3.5.2.el_capitan.bottle.tar.gz
==> Caveats
Emacs Lisp files have been installed to:
/usr/local/share/emacs/site-lisp/cmake
==> Summary
🍺 /usr/local/Cellar/cmake/3.5.2: 2,010 files, 27.5M
alinafe.matenda % cmake . /usr/local/lib/python2.7/site-packages/python-fann/FANN-2.2.0-Source
-- The C compiler identification is AppleClang 7.3.0.7030031
-- The CXX compiler identification is AppleClang 7.3.0.7030031
-- Check for working C compiler: /Library/Developer/CommandLineTools/usr/bin/cc
-- Check for working C compiler: /Library/Developer/CommandLineTools/usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /Library/Developer/CommandLineTools/usr/bin/c++
-- Check for working CXX compiler: /Library/Developer/CommandLineTools/usr/bin/c++ -- works
-- ... (remaining cmake output truncated)
alinafe.matenda% sudo make install

... (build output truncated)
[100%] Built target doublefann
Install the project...
-- Install configuration: ""
-- Installing: /usr/local/lib/pkgconfig/fann.pc
-- Installing: /usr/local/lib/libfloatfann.2.2.0.dylib
-- Installing: /usr/local/lib/libfloatfann.2.dylib
-- Installing: /usr/local/lib/libfloatfann.dylib
-- Installing: /usr/local/lib/libdoublefann.2.2.0.dylib
-- Installing: /usr/local/lib/libdoublefann.2.dylib
-- Installing: /usr/local/lib/libdoublefann.dylib
-- Installing: /usr/local/lib/libfixedfann.2.2.0.dylib
-- Installing: /usr/local/lib/libfixedfann.2.dylib
-- Installing: /usr/local/lib/libfixedfann.dylib
-- Installing: /usr/local/lib/libfann.2.2.0.dylib
-- Installing: /usr/local/lib/libfann.2.dylib
-- Installing: /usr/local/lib/libfann.dylib
-- Installing: /usr/local/include/fann.h
-- Installing: /usr/local/include/doublefann.h
-- Installing: /usr/local/include/fann_internal.h
-- Installing: /usr/local/include/floatfann.h
-- Installing: /usr/local/include/fann_data.h
-- Installing: /usr/local/include/fixedfann.h
-- Installing: /usr/local/include/compat_time.h
-- Installing: /usr/local/include/fann_activation.h
-- Installing: /usr/local/include/fann_cascade.h
-- Installing: /usr/local/include/fann_error.h
-- Installing: /usr/local/include/fann_train.h
-- Installing: /usr/local/include/fann_io.h
-- Installing: /usr/local/include/fann_cpp.h
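Before moving on, it is worth confirming that the files listed above actually landed on disk. A small sketch of such a check (the `missing_files` helper is my own, not part of FANN):

```python
# Sanity-check that "make install" put the expected FANN files in place.
import os

def missing_files(paths):
    """Return the subset of paths that do not exist on disk."""
    return [p for p in paths if not os.path.exists(p)]

expected = [
    "/usr/local/lib/libfann.dylib",
    "/usr/local/include/fann.h",
    "/usr/local/lib/pkgconfig/fann.pc",
]
missing = missing_files(expected)
print("all present" if not missing else "missing: " + ", ".join(missing))
```

If anything is reported missing, re-run `sudo make install` and watch for permission errors.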

alinafe.matenda% pwd
/usr/local/lib/python2.7/site-packages/python-fann/FANN-2.2.0-Source

alinafe.matenda% make runtest
/usr/local/lib/python2.7/site-packages/python-fann/FANN-2.2.0-Source/examples
gcc -O3 xor_train.c -o xor_train -lfann -lm
gcc -O3 xor_test.c -o xor_test -lfann -lm
gcc -O3 -DFIXEDFANN xor_test.c -o xor_test_fixed -lfixedfann -lm
gcc -O3 simple_train.c -o simple_train -lfann -lm
gcc -O3 steepness_train.c -o steepness_train -lfann -lm
gcc -O3 simple_test.c -o simple_test -lfann -lm
gcc -O3 robot.c -o robot -lfann -lm
gcc -O3 mushroom.c -o mushroom -lfann -lm
gcc -O3 cascade_train.c -o cascade_train -lfann -lm
gcc -O3 scaling_test.c -o scaling_test -lfann -lm
gcc -O3 scaling_train.c -o scaling_train -lfann -lm

Training network
./xor_train
Creating network.
Training network.
Max epochs     1000. Desired error: 0.0000000000.
Epochs            1. Current error: 0.3544595838. Bit fail 4.
Epochs           10. Current error: 0.0412630290. Bit fail 4.
Epochs           20. Current error: 0.0001692077. Bit fail 3.
Epochs           23. Current error: 0.0000335279. Bit fail 0.
Testing network. 0.000017
XOR test (-1.000000,-1.000000) -> -0.995908, should be -1.000000, difference=0.004092
XOR test (-1.000000,1.000000) -> 0.999355, should be 1.000000, difference=0.000645
XOR test (1.000000,-1.000000) -> 0.986100, should be 1.000000, difference=0.013900
XOR test (1.000000,1.000000) -> -0.992541, should be -1.000000, difference=0.007459
Saving network.
Cleaning up.
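In the test lines above, `difference` is the absolute gap between the network's output and the target value, and FANN counts an output as a "bit fail" when that gap exceeds the network's bit fail limit (0.010 for this network, per the parameter printout further down). A quick sketch recomputing those numbers from the transcript:

```python
# Recompute the "difference" column from the xor_train test output and
# count bit fails against the 0.010 bit fail limit FANN reports.
cases = [  # (network output, target) taken from the transcript above
    (-0.995908, -1.0),
    ( 0.999355,  1.0),
    ( 0.986100,  1.0),
    (-0.992541, -1.0),
]
BIT_FAIL_LIMIT = 0.010
diffs = [round(abs(out - target), 6) for out, target in cases]
bit_fails = sum(d > BIT_FAIL_LIMIT for d in diffs)
print(diffs)      # [0.004092, 0.000645, 0.0139, 0.007459]
print(bit_fails)  # 1 -- only the (1, -1) case exceeds the limit
```

So a trained network can still miss the bit fail limit on individual test cases even after training stopped at "Bit fail 0" on the training set.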

Testing network with floats
./xor_test
Creating network.
Layer / Neuron 0123456
L   1 / N    3 BCc....
L   1 / N    4 CBB....
L   1 / N    5 cCd....
L   1 / N    6 .......
L   2 / N    7 ...dDDb
L   2 / N    8 .......
Input layer                          :   2 neurons, 1 bias
  Hidden layer                       :   3 neurons, 1 bias
Output layer                         :   1 neurons
Total neurons and biases             :   8
Total connections                    :  13
Connection rate                      :   1.000
Network type                         :   FANN_NETTYPE_LAYER
Training algorithm                   :   FANN_TRAIN_RPROP
Training error function              :   FANN_ERRORFUNC_TANH
Training stop function               :   FANN_STOPFUNC_BIT
Bit fail limit                       :   0.010
Learning rate                        :   0.700
Learning momentum                    :   0.000
Quickprop decay                      :  -0.000100
Quickprop mu                         :   1.750
RPROP increase factor                :   1.200
RPROP decrease factor                :   0.500
RPROP delta min                      :   0.000
RPROP delta max                      :  50.000
Cascade output change fraction       :   0.010000
Cascade candidate change fraction    :   0.010000
Cascade output stagnation epochs     :  12
Cascade candidate stagnation epochs  :  12
Cascade max output epochs            : 150
Cascade min output epochs            :  50
Cascade max candidate epochs         : 150
Cascade min candidate epochs         :  50
Cascade weight multiplier            :   0.400
Cascade candidate limit              :1000.000
Cascade activation functions[0]      :   FANN_SIGMOID
Cascade activation functions[1]      :   FANN_SIGMOID_SYMMETRIC
Cascade activation functions[2]      :   FANN_GAUSSIAN
Cascade activation functions[3]      :   FANN_GAUSSIAN_SYMMETRIC
Cascade activation functions[4]      :   FANN_ELLIOT
Cascade activation functions[5]      :   FANN_ELLIOT_SYMMETRIC
Cascade activation functions[6]      :   FANN_SIN_SYMMETRIC
Cascade activation functions[7]      :   FANN_COS_SYMMETRIC
Cascade activation functions[8]      :   FANN_SIN
Cascade activation functions[9]      :   FANN_COS
Cascade activation steepnesses[0]    :   0.250
Cascade activation steepnesses[1]    :   0.500
Cascade activation steepnesses[2]    :   0.750
Cascade activation steepnesses[3]    :   1.000
Cascade candidate groups             :   2
Cascade no. of candidates            :  80
Testing network.
XOR test (-1.000000, -1.000000) -> -0.995908, should be -1.000000, difference=0.004092
XOR test (-1.000000, 1.000000) -> 0.999355, should be 1.000000, difference=0.000645
XOR test (1.000000, -1.000000) -> 0.986100, should be 1.000000, difference=0.013900
XOR test (1.000000, 1.000000) -> -0.992541, should be -1.000000, difference=0.007459
Cleaning up.

Testing network with fixed points
./xor_test_fixed
Creating network.
Layer / Neuron 0123456
L   1 / N    3 BCc....
L   1 / N    4 CBB....
L   1 / N    5 cCd....
L   1 / N    6 .......
L   2 / N    7 ...dDDb
L   2 / N    8 .......
Input layer                          :   2 neurons, 1 bias
  Hidden layer                       :   3 neurons, 1 bias
Output layer                         :   1 neurons
Total neurons and biases             :   8
Total connections                    :  13
Connection rate                      :   1.000
Network type                         :   FANN_NETTYPE_LAYER
Decimal point                        :  13
Multiplier                           :8192
Bit fail limit                       :  82
Testing network.
XOR test (-8192, -8192) -> -8192, should be -8192, difference=0.000000
XOR test (-8192, 8192) -> 8192, should be 8192, difference=0.000000
XOR test (8192, -8192) -> 7921, should be 8192, difference=0.033081
XOR test (8192, 8192) -> -8192, should be -8192, difference=0.000000
Cleaning up.
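The fixed-point build stores every value as an integer scaled by the multiplier 2^13 = 8192 (the decimal point of 13 reported above), so ±1.0 becomes ±8192 and the float bit fail limit of 0.010 scales to 82. A sketch of the conversion (helper names are mine, not FANN API):

```python
# Sketch of FANN's fixed-point representation as reported above:
# decimal point 13 -> multiplier 2**13 = 8192.
DECIMAL_POINT = 13
MULTIPLIER = 1 << DECIMAL_POINT          # 8192

def to_fixed(x):
    """Scale a float into the fixed-point integer form."""
    return int(round(x * MULTIPLIER))

def to_float(n):
    """Scale a fixed-point integer back to a float."""
    return n / float(MULTIPLIER)

print(MULTIPLIER)                 # 8192, matching the "Multiplier" line
print(to_fixed(-1.0))             # -8192, the fixed-point form of -1.0
print(to_fixed(0.010))            # 82, the scaled bit fail limit
print(round(abs(to_float(7921) - 1.0), 6))  # 0.033081, the reported difference
```

This also explains the one imperfect fixed-point test: 7921/8192 is about 0.967, which is 0.033081 short of the target 1.0.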

Reference:
http://leenissen.dk/fann/wp/help/installing-fann/
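Note that the transcript above only builds and installs the C library; the pyfann bindings still have to be installed separately (typically `python setup.py install` from the `python/` directory of the FANN source tree; treat that path as an assumption). Once they are in place, the xor_train example can be reproduced from Python roughly like this sketch, using the `libfann` names from the FANN 2.x bindings as I understand them:

```python
# Sketch: train a 2-3-1 XOR network with pyfann, mirroring xor_train.c.
# Guarded so the script still runs (and just reports) when pyfann is absent.
try:
    from pyfann import libfann
except ImportError:
    libfann = None
    print("pyfann not installed -- build the bindings first")

def train_xor(data_file="xor.data", net_file="xor_float.net"):
    """Train on a FANN-format data file and save the network, like xor_train.c."""
    if libfann is None:
        return None
    ann = libfann.neural_net()
    ann.create_standard_array([2, 3, 1])        # 2 inputs, 3 hidden, 1 output
    ann.set_activation_function_hidden(libfann.SIGMOID_SYMMETRIC)
    ann.set_activation_function_output(libfann.SIGMOID_SYMMETRIC)
    # max epochs 1000, report every 10 epochs, desired error 0.0 -- as above
    ann.train_on_file(data_file, 1000, 10, 0.0)
    ann.save(net_file)
    return ann
```

Run it from the `examples` directory so `xor.data` is found; the saved network can then be loaded and tested just as `./xor_test` does.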
