Diffusion MITK 96d41fe55459

Change to the relu initialization recommended by TensorFlow.

Description


In the TensorFlow tutorial, weights feeding into relu units are initialized from a
truncated Gaussian, and biases get a small positive offset so that no unit starts
with zero activation (a "dead" relu). Before, we initialized them with a plain
Gaussian centered around 0 and no bias offset.
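The scheme can be sketched in NumPy as follows. This is an illustrative re-implementation, not the code from this commit; the truncation rule (redraw samples beyond two standard deviations) matches TensorFlow's `truncated_normal`, and `stddev=0.1` / `bias=0.1` are the values used in the TensorFlow tutorial of that era.

```python
import numpy as np

def truncated_normal(shape, stddev=0.1, rng=None):
    """Zero-mean Gaussian samples, redrawing any value more than two
    standard deviations from the mean (TensorFlow's truncation rule)."""
    rng = np.random.default_rng(0) if rng is None else rng
    w = rng.normal(0.0, stddev, size=shape)
    out_of_range = np.abs(w) > 2 * stddev
    while out_of_range.any():
        w[out_of_range] = rng.normal(0.0, stddev, size=int(out_of_range.sum()))
        out_of_range = np.abs(w) > 2 * stddev
    return w

def relu_layer_init(n_in, n_out, stddev=0.1, bias=0.1):
    """Weights from a truncated Gaussian; biases a small positive
    constant so every relu unit starts with non-zero activation."""
    weights = truncated_normal((n_in, n_out), stddev=stddev)
    biases = np.full(n_out, bias)
    return weights, biases

weights, biases = relu_layer_init(64, 32)
```

With a plain Gaussian centered at 0, roughly half the relu units can start on the flat side of the activation and receive no gradient; the positive bias offset avoids that at the start of training.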

This initialization leads to much faster convergence.

Details

Provenance
Authored by wirkert on Apr 6 2016, 11:22 AM
Parents
rMITKc712c854ba82: First working version of neural network regression.
Branches
Unknown
Tags
Unknown