
Normalization Term Impact on Gradient and Hessian Scaling #44

Closed
RektPunk opened this issue Oct 9, 2024 · 0 comments · Fixed by #47
Assignees
Labels
enhancement make it efficient or clear

Comments


RektPunk commented Oct 9, 2024

It was observed that the normalization term affects both the gradient and the Hessian, while everything else remains unchanged. As a result, the normalization parameter currently has to be set to a very large value before it has any noticeable effect.
If we instead divide by N (the number of data points), the normalization effect becomes noticeable at much smaller parameter values.

For a future update, simply divide the gradient and Hessian by the number of data points to address this issue.

# Current computation (before the proposed 1/N scaling):
_err_for_alpha = _y_train[alpha_inx] - _y_pred[alpha_inx]
_grad = grad_fn(error=_err_for_alpha, alpha=alphas[alpha_inx], **kwargs)
_hess = hess_fn(error=_err_for_alpha, alpha=alphas[alpha_inx], **kwargs)
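A minimal sketch of the proposed change, assuming a quantile (check) loss: `grad_fn` and `hess_fn` below are hypothetical stand-ins for the loss derivatives used in the repository, and the constant surrogate Hessian is an assumption. The only point being illustrated is the final division by the number of data points.

```python
import numpy as np

def grad_fn(error: np.ndarray, alpha: float) -> np.ndarray:
    # Assumed check-loss (pinball) gradient w.r.t. the prediction:
    # -alpha where the residual is positive, (1 - alpha) otherwise.
    return np.where(error > 0, -alpha, 1.0 - alpha)

def hess_fn(error: np.ndarray, alpha: float) -> np.ndarray:
    # Constant surrogate Hessian, as is common for quantile objectives.
    return np.ones_like(error)

y_train = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.5, 1.5, 2.5, 4.5])
alpha = 0.5
n = len(y_train)

err = y_train - y_pred
# Proposed fix: divide the gradient and Hessian by N so the
# normalization term takes effect at smaller parameter values.
grad = grad_fn(error=err, alpha=alpha) / n
hess = hess_fn(error=err, alpha=alpha) / n
```

Because both the gradient and the Hessian are scaled by the same 1/N factor, the boosting updates themselves are unchanged; only the relative weight of the normalization term shifts.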
