base_models
Base forest-related models using LightGBM as the gradient booster.
Classes
BaseLGBMRandomForest
class BaseLGBMRandomForest(
    gradient_boosting: bool = False,
    num_leaves: Optional[int] = None,
    max_depth: Optional[int] = None,
    subsample_for_bin: Optional[int] = None,
    num_iterations: Optional[int] = None,
    learning_rate: Optional[float] = None,
    reg_alpha: Optional[float] = None,
    reg_lambda: Optional[float] = None,
    bagging_freq: Optional[float] = None,
    bagging_fraction: Optional[float] = None,
    feature_fraction: Optional[float] = None,
    early_stopping_rounds: Optional[int] = None,
    verbose: Optional[int] = None,
    min_split_gain: Optional[float] = None,
    **kwargs: Any,
):
Implements an (optionally Gradient Boosted) Random Forest from LightGBM.
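Since this class is abstract (note abc.ABC among its ancestors, listed below), it is not meant to be instantiated directly. As a rough orientation for what the constructor hyperparameters control, the following is a minimal sketch using plain LightGBM rather than bitfount; the dataset and parameter values are illustrative assumptions, not defaults of this class.

import lightgbm as lgb
import numpy as np

# Toy data standing in for a real training set.
X = np.random.rand(200, 5)
y = np.random.randint(0, 2, size=200)

params = {
    "objective": "binary",
    # gradient_boosting=False roughly corresponds to LightGBM's "rf" boosting
    # type; gradient_boosting=True to "gbdt" (a gradient-boosted ensemble).
    "boosting_type": "rf",
    "num_leaves": 31,
    "max_depth": -1,
    "learning_rate": 0.1,
    "reg_alpha": 0.0,
    "reg_lambda": 0.0,
    # LightGBM's "rf" mode requires bagging to be enabled.
    "bagging_freq": 1,
    "bagging_fraction": 0.8,
    "feature_fraction": 0.8,
    "min_split_gain": 0.0,
    "verbose": -1,
}
# num_iterations maps onto num_boost_round here.
booster = lgb.train(params, lgb.Dataset(X, label=y), num_boost_round=50)
predictions = booster.predict(X)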
Ancestors
- bitfount.models.base_models._BaseModel
- bitfount.models.base_models._BaseModelRegistryMixIn
- bitfount.types._BaseSerializableObjectMixIn
- abc.ABC
- typing.Generic
Methods
def deserialize(self, filename: Union[str, os.PathLike]) -> None:
Deserialize model.
def evaluate(self, test_dl: Optional[bitfount.data.dataloaders._BitfountDataLoader] = None, *args: Any, **kwargs: Any) -> Tuple[numpy.ndarray, numpy.ndarray]:
Perform inference on the test set and save a dictionary of metrics.
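The return value is a pair of numpy arrays; a reasonable reading, though not confirmed by this page, is the model's predictions alongside the ground-truth targets. A small sketch of consuming such a pair with standard metrics, with dummy arrays standing in for a real evaluate call:

import numpy as np
from sklearn.metrics import roc_auc_score

# Stand-ins for: preds, targets = model.evaluate(test_dl)
preds = np.array([0.2, 0.8, 0.6, 0.4])
targets = np.array([0, 1, 1, 0])
print(roc_auc_score(targets, preds))  # AUC from predicted scores vs. true labels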
def fit(self, data: Optional[BaseSource] = None, *args: Any, **kwargs: Any) -> None:
Trains a model using the training set provided by the BaseSource object.
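A rough sketch of the training flow follows; the concrete subclass name, the datasource class, and the column names are assumptions for illustration and may not match the actual bitfount API.

import pandas as pd

# Toy tabular data with a "label" target column.
df = pd.DataFrame(
    {"x1": [0.1, 0.5, 0.9, 0.3], "x2": [1.0, 0.2, 0.7, 0.4], "label": [0, 1, 1, 0]}
)

# Hypothetical concrete subclass and datasource (names assumed, not documented here):
# datasource = DataFrameSource(df)
# model = LGBMRandomForestClassifier(gradient_boosting=False, num_leaves=31)
# model.fit(datasource)  # trains on the training split provided by the BaseSource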
def get_params(self) -> Dict[~KT, ~VT]:
Return the model's parameters as a dictionary.
def serialize(self, filename: Union[str, os.PathLike]) -> None:
Serialize model.
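A hedged sketch of what serialize and deserialize plausibly wrap for a LightGBM-backed model, namely saving and reloading the underlying booster; the exact on-disk format used by this class is not documented here and may differ.

import numpy as np
import lightgbm as lgb

X = np.random.rand(100, 4)
y = np.random.randint(0, 2, 100)
booster = lgb.train({"objective": "binary", "verbose": -1}, lgb.Dataset(X, label=y), num_boost_round=10)

booster.save_model("model.txt")                 # analogous to model.serialize("model.txt")
restored = lgb.Booster(model_file="model.txt")  # analogous to model.deserialize("model.txt")
assert np.allclose(booster.predict(X), restored.predict(X))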
Variables
- static fields_dict : ClassVar[Dict[str, marshmallow.fields.Field]]