
JMP Wish List

We want to hear your ideas for improving JMP. Share them here.

Adding XGBoost, LightGBM & CatBoost modeling in JMP 'decision tree' menu

As you know, many data scientists use XGBoost, LightGBM, and CatBoost (gradient-boosted decision tree libraries) to solve their problems in Python.

Could you add these modeling methods to the decision tree menu in JMP as well?

If they could also be exposed through the JMP Scripting Language (JSL), that would be great. Thank you. :)

 

Reference: https://papers.nips.cc/paper/6907-lightgbm-a-highly-efficient-gradient-boosting-decision-tree.pdf
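
For context, here is a minimal Python sketch of the workflow referenced above. It assumes the standard xgboost, lightgbm, and catboost packages are installed; the dataset and settings are illustrative only.

    import xgboost as xgb
    import lightgbm as lgb
    from catboost import CatBoostClassifier
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score

    # Illustrative dataset; any tabular classification problem works the same way.
    X, y = load_breast_cancer(return_X_y=True)

    models = {
        "XGBoost": xgb.XGBClassifier(n_estimators=100, eval_metric="logloss"),
        "LightGBM": lgb.LGBMClassifier(n_estimators=100),
        "CatBoost": CatBoostClassifier(n_estimators=100, verbose=0),
    }

    # Compare the three gradient-boosted tree implementations with 5-fold CV.
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5)
        print(f"{name}: mean accuracy = {scores.mean():.3f}")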

4 Comments
Amir_H
Level III

Please add Extreme Gradient Boosting, as mentioned in the post above, as well as Adaptive Boosting (AdaBoost). These two methods are often more accurate than most other techniques.
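
For reference, AdaBoost is already available in Python via scikit-learn; the sketch below is illustrative only and is not a JMP feature.

    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)

    # AdaBoost re-weights the training samples so that each new weak learner
    # focuses on the examples the previous learners misclassified.
    ada = AdaBoostClassifier(n_estimators=200, learning_rate=0.5)
    print("AdaBoost mean CV accuracy:", cross_val_score(ada, X, y, cv=5).mean())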

SamGardner
Level VII

XGBoost can now be used in JMP if you download the XGBoost Add-In for JMP Pro from the File Exchange.

 

Steve_Kim
Level IV

Thank you for the amazing update! :)
I tested the XGBoost add-in, and it works well. (Impressive added functionality!)
As you know, XGBoost grows trees level-wise, while Microsoft's LightGBM grows trees leaf-wise.
I hope JMP can add a leaf-wise growth option someday.
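
To illustrate the level-wise vs. leaf-wise distinction, here is a minimal Python sketch using the documented xgboost and lightgbm parameters; the dataset and settings are illustrative only.

    import xgboost as xgb
    import lightgbm as lgb
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # XGBoost grows trees level-wise (depth-wise) by default; its histogram
    # tree method also offers leaf-wise growth via grow_policy="lossguide".
    xgb_clf = xgb.XGBClassifier(
        tree_method="hist",
        grow_policy="depthwise",  # set to "lossguide" for leaf-wise growth
        max_depth=6,
        n_estimators=200,
    )
    xgb_clf.fit(X_train, y_train)

    # LightGBM grows trees leaf-wise by default; num_leaves bounds tree size.
    lgb_clf = lgb.LGBMClassifier(num_leaves=31, n_estimators=200)
    lgb_clf.fit(X_train, y_train)

    print("XGBoost accuracy:", xgb_clf.score(X_test, y_test))
    print("LightGBM accuracy:", lgb_clf.score(X_test, y_test))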

Status changed to: Acknowledged

Hi @Steve_Kim, thank you for your suggestion! We have captured your request and will take it under consideration.