
nnetLib-gpuArray

Public repository · 2 stars · 3 forks · 0 issues

Commits (branch master):

- 910c32e732a641def62f1c625c3bd6e63c094704: Update README.md (kkloudkl, 11 years ago)
- e639715d8f9f090da852e5837594eee493b30e71: Update README.md (kkloudkl, 11 years ago)
- 1cf9123b2dcd08901e649b1aa7837be9053c3e77: Update README.md (kkloudkl, 11 years ago)
- 98f8b0f4d951ae991075789006835fc9ac7485dd: Create README.md (kkloudkl, 11 years ago)
- 3edd4fb3bb046258e4b03be94be5f35c0147e0d5: Added adaptive learning rate AdaGrad & AdaDec using gpuArray (kkloudkl, 11 years ago)

README

The README file for this repository.

nnetLib-gpuArray

Fast deep neural network with adaptive learning rate AdaGrad and AdaDec using gpuArray

This project was forked from Nicolas Le Roux's nnetLib, which you can find on his website and on MATLAB Central File Exchange.

The major improvements are porting the code to the GPU and adding adaptive learning rate scheduling.

Training a deep neural network on a GPU is more than 10 times faster than on a CPU, so algorithms can be experimented with and tuned over many parameter combinations in fast iteration cycles.

Adaptive learning rate scheduling policies set a parameter-wise learning rate based on the gradient history of each parameter. AdaGrad scales the base learning rate by 1 ./ sqrt(K + sum(gradient_histories .^ 2)).

AdaDec extends AdaGrad by introducing a forgetting factor that exponentially averages the sum of squared gradients. The denominator becomes sqrt(K + S_t), where S_t = forgetting_factor * S_{t-1} + sum_{h=0}^{N}(gradient_{t-h} .^ 2). In practice, forgetting_factor is set to 0.999 or a similar value. When forgetting_factor is 1 and the averaging window size N is 1, AdaDec falls back to AdaGrad.

The biggest drawback of AdaDec is that its memory requirement grows proportionally with the window size: a large deep network has millions to billions of parameters, so the value of N is bounded by the available GPU memory.
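The two update rules above can be sketched as follows. This is a minimal NumPy illustration (the repository itself uses MATLAB gpuArray); the names `adagrad_scale` and `AdaDec` are hypothetical and chosen for this example, not taken from the repo.

```python
import numpy as np
from collections import deque

def adagrad_scale(grad_sq_sum, K=1.0):
    # AdaGrad: per-parameter scale 1 / sqrt(K + sum of squared gradients)
    return 1.0 / np.sqrt(K + grad_sq_sum)

class AdaDec:
    """AdaDec accumulator (illustrative sketch, not the repo's MATLAB code).

    S_t = forgetting_factor * S_{t-1} + sum of the last N squared gradients.
    """
    def __init__(self, shape, forgetting_factor=0.999, N=1, K=1.0):
        self.S = np.zeros(shape)
        self.window = deque(maxlen=N)  # memory grows with the window size N
        self.ff = forgetting_factor
        self.K = K

    def scale(self, grad):
        self.window.append(grad ** 2)
        self.S = self.ff * self.S + sum(self.window)
        return 1.0 / np.sqrt(self.K + self.S)

# With forgetting_factor = 1 and N = 1, AdaDec reduces to AdaGrad:
ada = AdaDec(shape=(3,), forgetting_factor=1.0, N=1)
gsum = np.zeros(3)
for g in [np.array([0.1, -0.2, 0.3]), np.array([0.05, 0.1, -0.1])]:
    s_dec = ada.scale(g)
    gsum += g ** 2
    assert np.allclose(s_dec, adagrad_scale(gsum))
```

Note the `deque(maxlen=N)`: it keeps only the last N squared-gradient arrays, which is exactly the per-parameter memory cost that bounds N on the GPU.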