johnpjust
Level III

Custom activation function for Neural platform

The Neural Network platform is very handy, but the activation function options could be expanded. In concept it should be very easy to add activation functions, but I'd like to know if it is currently possible with JSL or some other method. At its core, I would think it amounts to providing a function and its derivative, and maybe an initialization strategy as well (though that is less important with shallow networks like those in JMP).
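To illustrate the idea (this is a hypothetical sketch in Python/NumPy, not JMP's actual API or JSL), "providing a function and its derivative" could look as simple as a pair of callables, since backpropagation only needs those two pieces:

```python
import numpy as np

# Hypothetical sketch: a custom activation defined as a function
# plus its analytic derivative, which is all backpropagation needs.
def tanh_act(x):
    return np.tanh(x)

def tanh_act_deriv(x):
    # d/dx tanh(x) = 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

# Quick sanity check: the analytic derivative should match a
# finite-difference estimate at an arbitrary point.
x0, h = 0.7, 1e-6
numeric = (tanh_act(x0 + h) - tanh_act(x0 - h)) / (2 * h)
```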

5 REPLIES
ian_jmp
Staff

Re: Custom activation function for Neural platform

This isn't possible at the moment. Do you have a situation in which you think this might help, and if so, could you describe it please?


Re: Custom activation function for Neural platform

Are you using JMP Pro? JMP Pro does offer a choice of several different activation functions. JMP Pro also offers two hidden layers rather than just the one hidden layer in JMP. I would definitely be curious as to what activation function you are interested in using.

Dan Obermiller
johnpjust
Level III

Re: Custom activation function for Neural platform

Yes, I have JMP Pro and I've worked with the Neural platform extensively, mostly for regression. I see potential for sure, and it also has a very good interface compared to using Python. However, there are only three activation function options, and the linear and Gaussian are not really useful IMO: the Gaussian is taken from SVMs, I'm sure, and does not seem to offer any advantage over tanh in my experience, and the identity function doesn't distort higher-order space, so it is no different from linear regression.

The tanh function is good, but I think it was originally intended for classification in ANNs. I can see the hard limits/saturations show up in regression actual-by-predicted plots, and I also see undesirable boundary-like conditions in real-time prediction data where the inputs are slowly evolving over time; again, an artifact of the tanh function. I would like to try arcsinh (inverse hyperbolic sine) to alleviate this issue. I hypothesize that its properties are more desirable for regression and may overcome the shortfalls of tanh for regression applications.
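The saturation argument is easy to see numerically. As a sketch (plain Python/NumPy, not JMP code): tanh is bounded at ±1, so for large inputs its output clips and its gradient vanishes, while arcsinh grows logarithmically and keeps a usable gradient.

```python
import numpy as np

# tanh saturates at +/-1; arcsinh(x) = ln(x + sqrt(x^2 + 1)) is unbounded.
def tanh_deriv(x):
    return 1.0 - np.tanh(x) ** 2

def asinh_deriv(x):
    # d/dx arcsinh(x) = 1 / sqrt(x^2 + 1)
    return 1.0 / np.sqrt(x ** 2 + 1.0)

x = 10.0
print(np.tanh(x))      # ~1.0   (saturated, output clipped)
print(np.arcsinh(x))   # ~3.0   (still growing)
print(tanh_deriv(x))   # ~8e-9  (gradient effectively gone)
print(asinh_deriv(x))  # ~0.10  (gradient still usable)
```

This is consistent with the boundary-like behavior described above: once inputs drift into tanh's flat tails, predictions pin near the saturation value, whereas arcsinh would keep responding.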

jwalk
Level III

Re: Custom activation function for Neural platform

I would also like to have more activation functions to choose from and/or the option of constructing a custom activation function.


Re: Custom activation function for Neural platform

Be sure to use the "JMP Wish List" link above. That is where you place your suggestions.

Dan Obermiller