
JMP Wish List

We want to hear your ideas for improving JMP. Share them here.

Neural network - More activation functions and the option of constructing custom functions

We need access to a much larger library of functions to choose from for the hidden layers of a neural network. It would also be great to be able to input custom functions and their derivatives.
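To illustrate what "custom functions and their derivatives" would involve, here is a minimal sketch in Python/NumPy (not JMP's own scripting language) of a user-defined activation, the swish function, together with its analytic derivative as backpropagation would require. The function choice and names are illustrative assumptions, not anything JMP currently exposes.

```python
import numpy as np

def swish(x, beta=1.0):
    # Swish activation: x * sigmoid(beta * x).
    # A hypothetical example of a custom activation a user might supply.
    return x / (1.0 + np.exp(-beta * x))

def swish_grad(x, beta=1.0):
    # Analytic derivative of swish, which a training engine would need
    # alongside the function itself for gradient-based optimization.
    s = 1.0 / (1.0 + np.exp(-beta * x))
    return s + beta * x * s * (1.0 - s)
```

A tool accepting custom activations could verify the supplied derivative against a finite-difference estimate before training, which is also a handy sanity check for the user.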

5 Comments
Steve_Kim
Level IV

Could you add the ReLU (Rectified Linear Unit) activation function as well? Thanks : )

craigwb
Level I

SELU and ReLU would be great. They are used in a lot of deep learning methods. More layers, please!

shampton82
Level VII

Can you add argmax and softmax too? I second the ReLU request as well!
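For reference, the functions requested in the comments above are all short to define. A sketch in Python/NumPy (assumed here purely for illustration; JMP itself does not use Python for this): note that argmax is not differentiable, so it is typically applied only at prediction time, with softmax serving as its smooth surrogate during training.

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit: max(0, x), applied elementwise.
    return np.maximum(0.0, x)

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled Exponential Linear Unit with the standard self-normalizing
    # constants from Klambauer et al. (2017).
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softmax(x):
    # Converts a score vector into probabilities that sum to 1.
    z = x - np.max(x)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()
```

At prediction time, `np.argmax(softmax(scores))` picks the most probable class.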

Status changed to: Acknowledged

Hi @jwalk, thank you for your suggestion! We have captured your request and will take it under consideration.

mia_stephens
Staff

While a broader set of activation functions is not directly available in JMP Pro, the Torch Deep Learning Add-in for JMP Pro provides a wide range of them.