Hey, I tried using XGBoost for quantile regression. I tweaked the example in the XGBoost API reference to include the quantile likelihood. Here's my code:
from darts.dataprocessing.transformers import Scaler
from darts.datasets import WeatherDataset
from darts.models import XGBModel

# Load and scale the weather dataset
series = WeatherDataset().load()
transformer = Scaler()
scaled_series = transformer.fit_transform(series)

# Target plus past and future covariates (future covariates extend beyond the target)
target = scaled_series['p (mbar)'][:100]
past_cov = scaled_series['rain (mm)'][:100]
future_cov = scaled_series['T (degC)'][:106]

model = XGBModel(
    lags=12,
    lags_past_covariates=12,
    lags_future_covariates=[0, 1, 2, 3, 4, 5],
    likelihood='quantile',
    tree_method='hist',
    output_chunk_length=1,
)
model.fit(target, past_covariates=past_cov, future_covariates=future_cov)

# Probabilistic historical forecasts sampled from the fitted quantile models
pred = model.historical_forecasts(
    target,
    past_covariates=past_cov,
    future_covariates=future_cov,
    retrain=False,
    num_samples=100,
)

target.plot(label="actual")
pred.plot(low_quantile=0.05, high_quantile=0.95, color='blue')
The result doesn't seem ideal:
I looked through the xgboost.py file and found that the model uses a custom objective function called 'xgb_quantile_loss()'. Since xgboost 2.0 officially supports quantile regression, I attempted to modify the 'fit' function in xgboost.py based on the xgboost 2.0 documentation: https://xgboost.readthedocs.io/en/latest/python/examples/quantile_regression.html
for quantile in self.quantiles:
    # Original code, commented out:
    # obj_func = partial(xgb_quantile_loss, quantile=quantile)
    # self.kwargs["objective"] = obj_func

    # Use xgboost 2.0's built-in quantile regression instead
    self.kwargs["objective"] = "reg:quantileerror"
    self.kwargs["quantile_alpha"] = quantile
    self.model = xgb.XGBRegressor(**self.kwargs)
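For reference, here is a minimal standalone sketch (independent of Darts, on made-up toy data) of how xgboost >= 2.0 exposes this natively through the sklearn-style API, roughly following the linked documentation example:

# Minimal sketch, independent of Darts: native quantile regression in xgboost >= 2.0.
# The toy data below is made up purely for illustration.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(500, 1))
y = np.sin(X.ravel()) + rng.normal(scale=0.3, size=500)

model = xgb.XGBRegressor(
    objective="reg:quantileerror",  # built-in pinball (quantile) loss
    quantile_alpha=0.95,            # the quantile level to estimate
    tree_method="hist",
    n_estimators=100,
)
model.fit(X, y)
upper = model.predict(X)  # predictions of the 0.95 quantile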
This time the results look better:
I believe there shouldn't be much difference between the custom loss function in Darts and 'reg:quantileerror' in xgboost 2.0. However, I suspect that "reg:quantileerror" triggers algorithmic optimizations specific to quantile regression, which could explain the better results.
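For context, a custom quantile objective for xgboost's sklearn API typically looks something like the sketch below (an illustrative pinball-loss objective, not the exact xgb_quantile_loss() implementation from Darts). Since the pinball loss is piecewise linear, its second derivative is zero, so a custom objective has to substitute a constant placeholder Hessian, which may be part of why the built-in objective behaves differently:

import numpy as np

def make_pinball_objective(quantile):
    # Illustrative sketch of a pinball-loss objective for xgboost's sklearn API;
    # not the exact implementation of Darts' xgb_quantile_loss().
    def objective(y_true, y_pred):
        errors = y_true - y_pred
        # d(pinball)/d(prediction): -q where the prediction is below the target,
        # (1 - q) where it is above.
        grad = np.where(errors >= 0, -quantile, 1.0 - quantile)
        # The true Hessian of a piecewise-linear loss is zero; a constant
        # placeholder keeps the Newton step well-defined.
        hess = np.ones_like(y_pred)
        return grad, hess
    return objective

An objective built this way can be passed directly as xgb.XGBRegressor(objective=make_pinball_objective(0.95), ...).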
I'm wondering if we should leverage the new features in xgboost 2.0 or if there are better ways to optimize the results of xgboost quantile regression.
Thanks