We need access to a much larger library of activation functions to choose from for the hidden layers of a neural network. It would also be great to be able to supply custom functions along with their derivatives.
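For illustration only, here is a minimal sketch of what "a custom function and its derivative" could look like as plain callables; the pair of functions and the `custom_activation` dictionary are hypothetical and do not reflect any existing registration API in the product.

```python
import numpy as np

def swish(x):
    """Swish activation: x * sigmoid(x)."""
    return x / (1.0 + np.exp(-x))

def swish_derivative(x):
    """Analytic derivative of swish, needed for backpropagation."""
    s = 1.0 / (1.0 + np.exp(-x))
    return s + x * s * (1.0 - s)

# Hypothetical way a user-defined activation might be passed in:
custom_activation = {"forward": swish, "backward": swish_derivative}
```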
Could you add the "ReLU (Rectified Linear Unit)" activation function as well? Thanks :)
SELU and ReLU would be great. They are used in a lot of deep learning methods. More layers please!
Can you add argmax and softmax too? I second the ReLU request!
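For reference, a short NumPy sketch of the functions requested in this thread, using the standard textbook definitions (SELU constants from Klambauer et al., 2017); this is not the product's implementation.

```python
import numpy as np

def relu(x):
    """ReLU: max(0, x) applied elementwise."""
    return np.maximum(0.0, x)

# Standard SELU constants (alpha, scale) from Klambauer et al. (2017).
_SELU_ALPHA = 1.6732632423543772
_SELU_SCALE = 1.0507009873554805

def selu(x):
    """SELU: scale * (x if x > 0 else alpha * (exp(x) - 1))."""
    return _SELU_SCALE * np.where(x > 0, x, _SELU_ALPHA * np.expm1(x))

def softmax(x):
    """Numerically stable softmax over the last axis."""
    z = x - np.max(x, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=-1, keepdims=True)

# argmax is usually an output-layer readout rather than a hidden-layer activation:
predictions = np.argmax(softmax(np.array([[2.0, 1.0, 0.1]])), axis=-1)
```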
Hi @jwalk, thank you for your suggestion! We have captured your request and will take it under consideration.