
Commits

List of commits on branch master.

  • 035e7c33b2d6b644188f855761284efd91e2e1d9 (Verified): Merge pull request #11 from rlunaro/internal-weights-query (davidalbertonogueira, 6 years ago)
  • e7ac6e6951032b7db404230b133dd11cb4d2980a (Unverified): fixed a typo in mpl.h and an omitted include in Sample.h (rlunaro, 6 years ago)
  • f3bb556188c57ec022923e75c25c83440dcbe11a (Verified): Merge pull request #10 from rlunaro/master (davidalbertonogueira, 6 years ago)
  • 72f2688c7d7241c63c7b83664601da62e8b6c44f (Unverified): finished of correcting "int" to "size_t" to avoid nasty errors and (rlunaro, 6 years ago)
  • f152b3030b332a00e537d8858df5347c6579fd4d (Unverified): added posibility to change internal weights of the network directly (rlunaro, 6 years ago)
  • aa3453baa84ea0bf72da6293203632f8437d9db0 (Unverified): fix in clean command in Makefile (rlunaro, 6 years ago)

README

MLP logo

MLP

About

MLP stands for multilayer perceptron. This project is a simple & fast C++ implementation of an MLP, oriented towards hacking and rapid prototyping. It is well-tested, with multiple tests for each component as well as use-case tests.

This project is maintained by David Nogueira.

Featuring

  • C++ implementation.
  • Modular design, with classes built on top of each other: Node, Layer and network classes (see the layering sketch after this list).
  • Easy to use and to hack.
  • Simple, fast and thread-safe.
  • Tests for each component module as well as use-case tests.
  • Supports saving & loading models.
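
To illustrate the layering idea, here is a minimal conceptual sketch. It is not this library's actual API: the class shapes, fields and method names below are illustrative assumptions; the real definitions live in the repository's headers.

#include <cmath>
#include <vector>

// Conceptual sketch of the Node -> Layer -> network layering (illustrative only).
struct Node {
  std::vector<double> weights;  // one weight per input (bias folded into the inputs)
  double Output(const std::vector<double>& in) const {
    double sum = 0.0;
    for (size_t i = 0; i < weights.size(); ++i) sum += weights[i] * in[i];
    return 1.0 / (1.0 + std::exp(-sum));  // sigmoid activation
  }
};

struct Layer {
  std::vector<Node> nodes;  // a layer is just a collection of nodes
  std::vector<double> Output(const std::vector<double>& in) const {
    std::vector<double> out;
    for (const auto& n : nodes) out.push_back(n.Output(in));
    return out;
  }
};

struct Network {
  std::vector<Layer> layers;  // each layer's output feeds the next layer
  std::vector<double> Output(std::vector<double> in) const {
    for (const auto& l : layers) in = l.Output(in);
    return in;
  }
};

int main() {
  Node n;
  n.weights = {0.5, -0.25, 0.1};
  Layer hidden;
  hidden.nodes = {n, n};
  Network net;
  net.layers = {hidden};
  std::vector<double> out = net.Output({1.0, 0.0, 1.0});  // one output per node
  return out.size() == 2 ? 0 : 1;
}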

OS Support

MLP offers support both for Windows (Microsoft Visual Studio) & Linux (g++/clang++).

Tests/Example Code

Some example programs are included with the source code.

  • IrisDatasetTest.cpp - Trains an MLP on the Iris dataset using backpropagation and tries to predict the classes.
  • MLPTest.cpp - Includes tests that train an MLP for AND, NAND, NOR, OR, NOT and XOR using backpropagation.
  • NodeTest.cpp - Includes tests that train a single node (i.e., a perceptron) for AND, NAND, NOR, OR, NOT and XOR using backpropagation. (A simple perceptron cannot learn the XOR function, since XOR is not linearly separable; see the sketch after this list.)
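
To make the XOR caveat concrete, here is a minimal, self-contained perceptron sketch, independent of this library's Node class. The same learning rule that converges on AND never settles on XOR, because a single linear threshold cannot separate XOR's positive points {(0,1), (1,0)} from its negative ones {(0,0), (1,1)}.

#include <array>
#include <cstdio>

// A lone perceptron: fires when w0*x0 + w1*x1 + b > 0 (a linear threshold).
struct Perceptron {
  double w0 = 0.0, w1 = 0.0, b = 0.0;
  int Predict(double x0, double x1) const {
    return (w0 * x0 + w1 * x1 + b > 0.0) ? 1 : 0;
  }
  // Classic perceptron learning rule: nudge weights by the prediction error.
  void Train(const std::array<std::array<double, 3>, 4>& samples,
             int epochs, double lr) {
    for (int e = 0; e < epochs; ++e)
      for (const auto& s : samples) {
        double err = s[2] - Predict(s[0], s[1]);
        w0 += lr * err * s[0];
        w1 += lr * err * s[1];
        b += lr * err;
      }
  }
};

int main() {
  // {x0, x1, label} truth tables.
  std::array<std::array<double, 3>, 4> and_data = {{{0,0,0},{0,1,0},{1,0,0},{1,1,1}}};
  std::array<std::array<double, 3>, 4> xor_data = {{{0,0,0},{0,1,1},{1,0,1},{1,1,0}}};
  Perceptron p_and, p_xor;
  p_and.Train(and_data, 100, 0.1);  // converges: AND is linearly separable
  p_xor.Train(xor_data, 100, 0.1);  // oscillates: XOR is not
  std::printf("AND(1,1)=%d  XOR(1,0)=%d\n",
              p_and.Predict(1, 1), p_xor.Predict(1, 0));
  return 0;
}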

Example

Let us look at an example. After loading the data and creating the training/dev/test data structures, we will create an MLP with input size 5 (assuming 4 input data features + 1 bias), a hidden layer of 4 neurons and an output layer with 3 outputs (3 possible predicted classes). The activation functions will be a sigmoid for the hidden layer and a linear one for the output layer.

#include "MLP.h"
#include <vector>

// ...
std::vector<TrainingSample> training_set;
// ...

// Topology: 4 inputs + 1 bias, one hidden layer of 4 neurons, 3 outputs.
MLP my_mlp({ 4 + 1, 4, 3 }, { "sigmoid", "linear" }, false);

// Train with backpropagation (learning rate 0.01, `loops` iterations).
int loops = 5000;
my_mlp.Train(training_set, 0.01, loops, 0.10, false);

int correct = 0;
for (size_t j = 0; j < training_set.size(); ++j) {
  std::vector<double> guess;
  my_mlp.GetOutput(training_set[j].input_vector(), &guess);
  size_t class_id;
  my_mlp.GetOutputClass(guess, &class_id);

  // Compare class_id with the gold class id and increment `correct`.
}
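
The class decision itself is just an argmax over the raw outputs. The standalone helper below (a sketch, not the library's code) shows what GetOutputClass presumably amounts to in this example:

#include <algorithm>
#include <cstddef>
#include <vector>

// Index of the largest output, i.e. the predicted class id.
size_t ArgMax(const std::vector<double>& outputs) {
  return static_cast<size_t>(std::distance(
      outputs.begin(), std::max_element(outputs.begin(), outputs.end())));
}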

Saving and loading models is also very intuitive:

#include "MLP.h"
#include <string>

std::string model = "../../data/iris.mlp";
{
  //...
  my_mlp.SaveMLPNetwork(model); // saving
}
{
  MLP my_mlp(model); // load a model via the constructor
  //...
}
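
Once loaded, the model can be queried exactly as in the example above, via GetOutput and GetOutputClass.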