QuantileRegressor
Linear regression model that predicts conditional quantiles.

The linear QuantileRegressor optimizes the pinball loss for a desired quantile and is robust to outliers. This model uses an L1 regularization like Lasso.
Read more in the User Guide.
Python Reference
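For reference, the pinball loss for a target quantile \(q\) follows the standard definition (this is general background, not something specific to this wrapper):

\[
\text{pinball}_q(y, \hat{y}) =
\begin{cases}
q \,(y - \hat{y}) & \text{if } y \ge \hat{y} \\
(1 - q)\,(\hat{y} - y) & \text{otherwise}
\end{cases}
\]

The training objective combines the mean pinball loss over the samples with the L1 penalty on the coefficients, weighted by alpha.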
Constructors
constructor()
Signature
new QuantileRegressor(opts?: object): QuantileRegressor;
Parameters
Name | Type | Description |
---|---|---|
opts? | object | - |
opts.alpha? | number | Regularization constant that multiplies the L1 penalty term. Default Value 1 |
opts.fit_intercept? | boolean | Whether or not to fit the intercept. Default Value true |
opts.quantile? | number | The quantile that the model tries to predict. It must be strictly between 0 and 1. If 0.5 (default), the model predicts the 50% quantile, i.e. the median. Default Value 0.5 |
opts.solver? | "highs-ds" | "highs-ipm" | "highs" | "interior-point" | "revised simplex" | Method used by scipy.optimize.linprog to solve the linear programming formulation. From scipy>=1.6.0, it is recommended to use the highs methods because they are the fastest ones. Solvers “highs-ds”, “highs-ipm” and “highs” support sparse input data and, in fact, always convert to sparse csc. From scipy>=1.11.0, “interior-point” is not available anymore. Default Value 'interior-point' |
opts.solver_options? | any | Additional parameters passed to scipy.optimize.linprog as options. If undefined and if solver='interior-point', then {"lstsq": true} is passed to scipy.optimize.linprog for the sake of stability. |
Returns
QuantileRegressor
Defined in: generated/linear_model/QuantileRegressor.ts:27
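A minimal construction sketch; the import path below is an assumption and should match however this package exposes QuantileRegressor in your setup:

```ts
import { QuantileRegressor } from 'sklearn' // assumed import path

// Model the 90th percentile with a light L1 penalty.
const model = new QuantileRegressor({
  quantile: 0.9,
  alpha: 0.1,
  fit_intercept: true,
  solver: 'highs', // recommended with scipy >= 1.6.0
})
```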
Properties
_isDisposed
boolean
=false
Defined in: generated/linear_model/QuantileRegressor.ts:25
_isInitialized
boolean
=false
Defined in: generated/linear_model/QuantileRegressor.ts:24
_py
PythonBridge
Defined in: generated/linear_model/QuantileRegressor.ts:23
id
string
Defined in: generated/linear_model/QuantileRegressor.ts:20
opts
any
Defined in: generated/linear_model/QuantileRegressor.ts:21
Accessors
coef_
Estimated coefficients for the features.
Signature
coef_(): Promise<any[]>;
Returns
Promise<any[]>
Defined in: generated/linear_model/QuantileRegressor.ts:285
feature_names_in_
Names of features seen during fit. Defined only when X has feature names that are all strings.
Signature
feature_names_in_(): Promise<ArrayLike>;
Returns
Promise<ArrayLike>
Defined in: generated/linear_model/QuantileRegressor.ts:366
intercept_
The intercept of the model, aka bias term.
Signature
intercept_(): Promise<number>;
Returns
Promise<number>
Defined in: generated/linear_model/QuantileRegressor.ts:312
n_features_in_
Number of features seen during fit.
Signature
n_features_in_(): Promise<number>;
Returns
Promise<number>
Defined in: generated/linear_model/QuantileRegressor.ts:339
n_iter_
The actual number of iterations performed by the solver.
Signature
n_iter_(): Promise<number>;
Returns
Promise<number>
Defined in: generated/linear_model/QuantileRegressor.ts:393
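Once the model has been fitted (see fit() below), the accessors above can be awaited. A brief sketch, assuming a fitted model instance:

```ts
// Each accessor returns a Promise because the value is read over the Python bridge.
const coef = await model.coef_           // estimated feature coefficients
const intercept = await model.intercept_ // bias term
const nIter = await model.n_iter_        // iterations performed by the solver
console.log({ coef, intercept, nIter })
```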
py
Signature
py(): PythonBridge;
Returns
PythonBridge
Defined in: generated/linear_model/QuantileRegressor.ts:74
Signature
py(pythonBridge: PythonBridge): void;
Parameters
Name | Type |
---|---|
pythonBridge | PythonBridge |
Returns
void
Defined in: generated/linear_model/QuantileRegressor.ts:78
Methods
dispose()
Disposes of the underlying Python resources.
Once dispose() is called, the instance is no longer usable.
Signature
dispose(): Promise<void>;
Returns
Promise<void>
Defined in: generated/linear_model/QuantileRegressor.ts:133
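A teardown sketch; after the returned Promise resolves, the instance must not be reused:

```ts
await model.dispose() // releases the underlying Python objects
// `model` is no longer usable here; construct and init() a new instance if needed.
```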
fit()
Fit the model according to the given training data.
Signature
fit(opts: object): Promise<any>;
Parameters
Name | Type | Description |
---|---|---|
opts | object | - |
opts.X? | ArrayLike | Training data. |
opts.sample_weight? | ArrayLike | Sample weights. |
opts.y? | ArrayLike | Target values. |
Returns
Promise<any>
Defined in: generated/linear_model/QuantileRegressor.ts:150
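A fitting sketch, assuming the instance has already been initialized with a PythonBridge via init():

```ts
// Tiny single-feature training set for illustration.
const X = [[0], [1], [2], [3], [4]]
const y = [0.2, 1.1, 1.9, 3.2, 3.9]

await model.fit({ X, y })
// Optionally pass per-sample weights:
await model.fit({ X, y, sample_weight: [1, 1, 1, 2, 2] })
```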
init()
Initializes the underlying Python resources.
This instance is not usable until the Promise returned by init() resolves.
Signature
init(py: PythonBridge): Promise<void>;
Parameters
Name | Type |
---|---|
py | PythonBridge |
Returns
Promise<void>
Defined in: generated/linear_model/QuantileRegressor.ts:87
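An initialization sketch; how the PythonBridge is obtained depends on your setup, so `createPythonBridge()` below is a hypothetical placeholder:

```ts
const py = await createPythonBridge() // hypothetical helper; use your own bridge setup
const model = new QuantileRegressor({ quantile: 0.5 })
await model.init(py) // the instance becomes usable only after this resolves
```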
predict()
Predict using the linear model.
Signature
predict(opts: object): Promise<any>;
Parameters
Name | Type | Description |
---|---|---|
opts | object | - |
opts.X? | any | Samples. |
Returns
Promise<any>
Defined in: generated/linear_model/QuantileRegressor.ts:199
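A prediction sketch for a fitted model; each output is the model's estimate of the configured quantile for that sample:

```ts
// Predict the configured quantile (e.g. the median for quantile: 0.5) for new samples.
const predictions = await model.predict({ X: [[1.5], [2.5]] })
console.log(predictions)
```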
score()
Return the coefficient of determination of the prediction.
The coefficient of determination \(R^2\) is defined as \((1 - \frac{u}{v})\), where \(u\) is the residual sum of squares ((y_true - y_pred) ** 2).sum() and \(v\) is the total sum of squares ((y_true - y_true.mean()) ** 2).sum(). The best possible score is 1.0 and it can be negative (because the model can be arbitrarily worse). A constant model that always predicts the expected value of y, disregarding the input features, would get a \(R^2\) score of 0.0.
Signature
score(opts: object): Promise<number>;
Parameters
Name | Type | Description |
---|---|---|
opts | object | - |
opts.X? | ArrayLike[] | Test samples. For some estimators this may be a precomputed kernel matrix or a list of generic objects instead with shape (n_samples, n_samples_fitted), where n_samples_fitted is the number of samples used in the fitting for the estimator. |
opts.sample_weight? | ArrayLike | Sample weights. |
opts.y? | ArrayLike | True values for X . |
Returns
Promise<number>
Defined in: generated/linear_model/QuantileRegressor.ts:236
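A scoring sketch on held-out data; note that \(R^2\) measures mean prediction quality rather than the pinball loss the model was trained on:

```ts
// R² of the predictions on held-out data (1.0 is best; values can be negative).
const r2 = await model.score({
  X: [[0], [2], [4]],
  y: [0.2, 1.9, 3.9],
})
console.log(r2)
```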