% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/lgb.importance.R
\name{lgb.importance}
\alias{lgb.importance}
\title{Compute feature importance in a model}
\usage{
lgb.importance(model, percentage = TRUE)
}
\arguments{
\item{model}{object of class \code{lgb.Booster}.}

\item{percentage}{whether to show importance in relative percentage.}
}
\value{
For a tree model, a \code{data.table} with the following columns:
\itemize{
  \item \code{Feature}: Feature names in the model.
  \item \code{Gain}: The total gain of this feature's splits.
  \item \code{Cover}: The number of observations related to this feature.
  \item \code{Frequency}: The number of times a feature is used in splits.
}
}
\description{
Creates a \code{data.table} of feature importances in a model.
}
\examples{
data(agaricus.train, package = "lightgbm")
train <- agaricus.train
dtrain <- lgb.Dataset(train$data, label = train$label)
params <- list(
  objective = "binary"
  , learning_rate = 0.01
  , num_leaves = 63
  , max_depth = -1
  , min_data_in_leaf = 1
  , min_sum_hessian_in_leaf = 1
)
model <- lgb.train(params, dtrain, 20)

tree_imp1 <- lgb.importance(model, percentage = TRUE)
tree_imp2 <- lgb.importance(model, percentage = FALSE)
}