Neural network - More activation functions and the option of constructing custom functions

We need access to a much larger library of activation functions to choose from for the hidden layers of a neural network. It would also be great to be able to supply custom functions along with their derivatives.
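As an illustration of what supplying a custom function might look like, a user could provide the activation and its derivative as a pair of callables. This is only a sketch of a possible interface, not an existing API; the function names and the pair convention here are assumptions. Swish is used as the example activation:

```python
import numpy as np

def swish(x):
    # swish(x) = x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def swish_deriv(x):
    # d/dx [x * sigmoid(x)] = sigmoid(x) * (1 + x * (1 - sigmoid(x)))
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 + x * (1.0 - s))

# Hypothetical usage: the network would accept the pair
# (activation, derivative) for a hidden layer, e.g.
# hidden_activation = (swish, swish_deriv)
```

Accepting the derivative explicitly keeps backpropagation simple and avoids the library having to differentiate arbitrary user code.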

1 Comment

Could you add the "ReLU (Rectified Linear Unit)" activation function as well? Thanks :)
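For reference, ReLU and its (sub)derivative are straightforward to define; a minimal NumPy sketch:

```python
import numpy as np

def relu(x):
    # elementwise max(0, x)
    return np.maximum(0.0, x)

def relu_deriv(x):
    # subgradient: 1 where x > 0, 0 elsewhere
    # (the derivative is undefined at x = 0; 0 is a common convention)
    return (x > 0).astype(float)
```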