We need access to a much larger library of functions to choose from for the hidden layers of a neural network. It would also be great to be able to input custom functions and their derivatives.
Could you add the ReLU (Rectified Linear Unit) activation function as well? Thanks :)
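In case it helps, here is a minimal sketch of ReLU and its derivative as a custom activation pair, assuming the network accepts a user-supplied function and derivative (NumPy used for illustration; the actual plug-in interface may differ):

```python
import numpy as np

def relu(x):
    # ReLU: elementwise max(0, x)
    return np.maximum(0.0, x)

def relu_derivative(x):
    # Derivative: 1 where x > 0, else 0
    # (undefined at x = 0; using 0 there is the common convention)
    return (x > 0).astype(float)
```

A custom-activation hook would then take `relu` for the forward pass and `relu_derivative` for backpropagation.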