After you select OK in the launch window, the Gradient-Boosted Trees Specification window appears.
A number r such that 0 < r ≤ 1 that scales the contribution of each layer. Learning rates close to 1 result in faster convergence on a final tree, but also have a higher tendency to overfit the data. Use learning rates closer to 1 when a small Number of Layers is specified. Small learning rates (typically between 0.01 and 0.1) slow the convergence of the model, which preserves opportunities for later layers to use different splits than the earlier layers.
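The effect of the learning rate can be sketched with a toy boosting loop (illustrative only, not the platform's actual implementation): each layer fits the current residuals, and its contribution is shrunk by the learning rate before being added to the running prediction. The base learner here is deliberately trivial (a constant) so the shrinkage effect is easy to see.

```python
# Toy sketch of the learning rate's role in boosting. The base learner is a
# constant (mean of the residuals), chosen only to make the effect visible.

def boost(y, n_layers, r):
    """Additively fit y over n_layers, shrinking each layer's fit by r."""
    pred = [0.0] * len(y)
    for _ in range(n_layers):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        layer_fit = sum(residuals) / len(residuals)   # base learner: a constant
        pred = [pi + r * layer_fit for pi in pred]    # shrink contribution by r
    return pred

y = [1.0, 2.0, 3.0, 4.0]
fast = boost(y, n_layers=10, r=1.0)   # r = 1: converges immediately
slow = boost(y, n_layers=10, r=0.1)   # r = 0.1: approaches the fit gradually
```

With r = 1 the constant learner reaches the mean in a single layer; with r = 0.1 the prediction approaches it gradually, leaving later layers work to do. In a real boosted tree the same shrinkage lets later layers choose different splits than earlier ones.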
(Available only for categorical responses.) A biasing parameter that helps protect against fitting probabilities equal to zero. See Overfit Penalty.
Opens a window where you can select a data table containing values for some tuning parameters, called a tuning design table. A tuning design table has a column for each option that you want to specify and has one or more rows, each of which represents a single Boosted Tree model design. If an option is not specified in the tuning design table, the default value is used.
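The row-plus-defaults behavior of a tuning design table can be sketched as follows (a hypothetical illustration; the option names and default values shown are placeholders, not the platform's actual settings):

```python
# Hypothetical sketch: each row of a tuning design table specifies some options
# for one model design; any option a row omits falls back to the default.
# Option names and default values below are illustrative placeholders.

DEFAULTS = {"Number of Layers": 50, "Learning Rate": 0.1, "Splits per Tree": 3}

tuning_design = [
    {"Learning Rate": 0.05},                          # row 1: one option set
    {"Number of Layers": 100, "Learning Rate": 0.3},  # row 2: two options set
]

# Merge each row over the defaults to obtain one complete design per row.
designs = [{**DEFAULTS, **row} for row in tuning_design]
```

Each entry in `designs` is a fully specified model design: options the row sets override the defaults, and everything else keeps its default value.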