Clay Barker has been busy extending the usefulness of the Generalized Regression platform in JMP Pro, adding many new models and enhancing ease of use. Generalized Regression (or GenReg for short) debuted in JMP Pro 11 as the place to do a trio of popular penalized regression techniques: Lasso, Elastic Net and Ridge. These penalized techniques are attractive because they lead to simpler models that are less prone to overfitting. After adding more modeling and selection techniques in JMP Pro 12 and now JMP 13, GenReg has become a place to do variable selection quickly and easily for a wide variety of problems.
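For readers who want to see what these three penalties do, here is a minimal sketch using scikit-learn rather than JMP's own implementation (the data, penalty strengths, and variable names are all illustrative):

```python
# Illustration of the trio GenReg debuted with: Lasso (L1 penalty),
# Ridge (L2 penalty), and Elastic Net (a blend of the two).
# Sketched with scikit-learn; this is not JMP's implementation.
import numpy as np
from sklearn.linear_model import Lasso, Ridge, ElasticNet

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]          # only 3 of the 20 predictors matter
y = X @ beta + rng.standard_normal(n)

lasso = Lasso(alpha=0.1).fit(X, y)                     # L1: zeroes out coefficients
ridge = Ridge(alpha=1.0).fit(X, y)                     # L2: shrinks, never zeroes
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)   # mix of L1 and L2

print("Lasso keeps", np.sum(lasso.coef_ != 0), "of", p, "predictors")
print("Ridge keeps", np.sum(ridge.coef_ != 0), "of", p, "predictors")
```

The Lasso drops most of the irrelevant predictors outright, which is why penalized techniques like these lead to the simpler, less overfit models the post describes; Ridge keeps every predictor but shrinks the coefficients.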
Variable selection is where much of the “art” is in model building, and even more so with ever-wider data.
“Customers have been asking for variable selection for time-to-event data for some time, and now GenReg will be able to do that. There are not a lot of easy options for doing variable selection across so many different scenarios,” says Clay, a Senior Research Statistician Developer at JMP.
One of the biggest enhancements in the Generalized Regression platform for JMP Pro 13 is the ability to handle censored data, enabling parametric survival analysis and proportional hazards models. Fellow developer Peng Liu even added a link to the Generalized Regression platform from within the Survival Analysis platform (in JMP Pro).
Variable selection is an active area of research in statistics. To do it well and build genuinely useful models, you need a breadth of tools, including hybrid approaches that build on automated selection techniques like the Lasso and Elastic Net. The interactive nature of GenReg makes such a hybrid strategy easy: you can quickly explore alternative models suggested by the automated selection. JMP Pro 13 also adds powerful new variable selection methods, including a modified two-stage forward selection that works first on the main effects and then on the higher-order effects involving those main effects, making Generalized Regression a premier tool for analyzing designed experiments.
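The two-stage idea can be sketched in plain NumPy. This is a rough illustration of the general approach (forward selection by BIC on main effects, then on two-way interactions built from the selected mains), not JMP's algorithm; the `bic` and `forward` helpers and the toy data are all made up for the example:

```python
# Rough sketch of two-stage forward selection: stage 1 steps forward
# through main effects by BIC; stage 2 then steps through two-way
# interactions built only from the main effects that survived stage 1.
import itertools
import numpy as np

def bic(cols, y):
    """BIC of an OLS fit with an intercept plus the given columns."""
    n = len(y)
    X = np.column_stack([np.ones(n)] + list(cols))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return n * np.log(rss / n) + X.shape[1] * np.log(n)

def forward(pool, chosen, y):
    """Greedily move terms from pool into chosen while BIC improves."""
    while True:
        current = bic([c for _, c in chosen], y)
        scores = {name: bic([c for _, c in chosen] + [col], y)
                  for name, col in pool.items()}
        if not scores:
            return chosen
        best = min(scores, key=scores.get)
        if scores[best] >= current:
            return chosen
        chosen.append((best, pool.pop(best)))

rng = np.random.default_rng(2)
n = 60
A, B, C, D = (rng.standard_normal(n) for _ in range(4))
y = 2 * A - B + 1.5 * A * B + 0.5 * rng.standard_normal(n)

# Stage 1: main effects only.
mains = forward({"A": A, "B": B, "C": C, "D": D}, [], y)
names = [nm for nm, _ in mains]

# Stage 2: two-way interactions among the selected main effects.
cols = dict(mains)
inters = {f"{i}*{j}": cols[i] * cols[j]
          for i, j in itertools.combinations(names, 2)}
model = forward(inters, list(mains), y)
print("selected terms:", [nm for nm, _ in model])
```

Restricting stage 2 to interactions of the selected main effects keeps the candidate set small and respects effect heredity, which is why this style of selection suits designed experiments.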
Also new in JMP Pro 13 is the Double Lasso, a two-stage modeling technique in which a first pass of the Lasso screens the variables and a second pass is run on only the variables selected in the first. Running two passes effectively separates the Lasso's selection and shrinkage steps, which can lead to better predictions. Another highlight is the addition of the Extended Regularized Information Criterion (ERIC). ERIC is similar in spirit to the Bayesian Information Criterion, but it was derived specifically for the Adaptive Lasso.
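The Double Lasso idea itself is easy to sketch. Here is a minimal illustration with scikit-learn's `LassoCV` (again, not JMP's implementation; the variable names and toy data are assumptions for the example):

```python
# Sketch of the Double Lasso: pass 1 screens variables, pass 2 refits
# the Lasso on only the survivors, so the final amount of shrinkage is
# not inflated by having to zero out the discarded predictors.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)
n, p = 120, 30
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:4] = [2.0, -1.5, 1.0, 0.5]
y = X @ beta + rng.standard_normal(n)

# Pass 1: Lasso over the full predictor set, used purely for screening.
screen = LassoCV(cv=5).fit(X, y)
keep = np.flatnonzero(screen.coef_ != 0)

# Pass 2: Lasso again, restricted to the screened-in columns.
refit = LassoCV(cv=5).fit(X[:, keep], y)

final_coef = np.zeros(p)
final_coef[keep] = refit.coef_
print("pass 1 selected", keep.size, "of", p, "predictors")
```

Because pass 2 works on a much smaller predictor set, it can use a lighter penalty, so the coefficients it keeps are shrunk less toward zero, which is the sense in which the Double Lasso separates selection from shrinkage.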
You can find out more about what’s coming in JMP Pro 13 by visiting the preview page on our website. There, you can sign up to watch a live stream of JMP chief architect John Sall’s tour of JMP 13 on Sept. 21, as well as watch short videos about JMP 13 and JMP Pro 13.
The post JMP 13 Preview: More enhancements to generalized regression appeared first on JMP Blog.