Prevent overflow in exponential link functions by clipping linear predictor#568

Closed
vasanthkumargr wants to merge 4 commits into dswah:main from vasanthkumargr:fix/log-link-overflow
Conversation

@vasanthkumargr

This PR addresses a `RuntimeWarning: overflow encountered in exp` raised in `pygam/links.py`.

When the linear predictor (`lp`) takes on very large values (e.g., during grid search with large regularization parameters), `np.exp(lp)` overflows to infinity. The overflow propagates through subsequent computations—particularly weight calculations such as `self.link.gradient(mu, self.distribution) ** 2`—and can produce NaN values (e.g., from `inf * 0`).

Changes Made

Updated `LogLink.mu` and `LogitLink.mu` to clip the linear predictor with `lp = np.clip(lp, -500, 500)` before applying exponentiation.
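The effect of the clip can be sketched as standalone functions (a minimal sketch: the real pygam methods are instance methods that also take a distribution argument, and the function names below are illustrative, not pygam's API):

```python
import numpy as np

def log_link_mu(lp):
    """Inverse of the log link, mu = exp(lp).

    Clipping lp to [-500, 500] keeps np.exp finite
    (exp(500) ~ 1.4e217 < float64 max ~ 1.8e308).
    """
    lp = np.clip(lp, -500, 500)
    return np.exp(lp)

def logit_link_mu(lp):
    """Inverse of the logit link, mu = exp(lp) / (1 + exp(lp)).

    The same clip prevents exp(lp) from overflowing to inf,
    which would otherwise yield inf / inf = nan.
    """
    lp = np.clip(lp, -500, 500)
    elp = np.exp(lp)
    return elp / (elp + 1.0)
```

With the clip in place, extreme predictors such as `lp = 1e8` map to large but finite `mu` values instead of `inf`, so downstream weight calculations stay NaN-free.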

Added regression tests in `pygam/tests/test_links.py` to verify stability for extreme `lp` values and prevent future regressions.
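A regression test of this shape can be written against plain NumPy (an illustrative sketch of the scenario being guarded, not the actual test added in `pygam/tests/test_links.py`):

```python
import numpy as np

def test_clipped_exp_is_finite_for_extreme_lp():
    # Linear predictor values that previously made np.exp overflow to inf.
    lp = np.array([-1e10, -750.0, 0.0, 750.0, 1e10])

    # The patched behavior: clip before exponentiating.
    mu = np.exp(np.clip(lp, -500, 500))

    # No inf/nan may survive, and mu must stay strictly positive
    # so downstream weight calculations remain well-defined.
    assert np.all(np.isfinite(mu))
    assert np.all(mu > 0)
```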

- Prevents numerical overflow and the resulting NaN propagation.
- Improves stability during training, especially in edge cases (e.g., large lambda values).
- Adds test coverage for extreme-value scenarios.

