Deprecate RAdam optimizer properly. (#389)
jettify authored Oct 31, 2021
1 parent 51e361c commit 64b6782
Showing 4 changed files with 17 additions and 1 deletion.
4 changes: 4 additions & 0 deletions CHANGES.rst
@@ -1,6 +1,10 @@
 Changes
 -------
 
+0.3.1 (YYYY-MM-DD)
+------------------
+* Deprecate RAdam optimizer.
+
 0.3.0 (2021-10-30)
 ------------------
 * Revert for Drop RAdam.
2 changes: 2 additions & 0 deletions README.rst
@@ -775,6 +775,8 @@ RAdam
 | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_RAdam.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_RAdam.png |
 +---------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------+
 
+Deprecated, please use version provided by PyTorch_.
+
 .. code:: python
 
   import torch_optimizer as optim
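
Since the README now points users at the native implementation, here is a minimal migration sketch, assuming PyTorch >= 1.10 (the release that ships ``torch.optim.RAdam``); the model and hyperparameters are placeholders:

.. code:: python

    import torch

    model = torch.nn.Linear(10, 2)  # placeholder module

    # torch.optim.RAdam is the natively provided replacement for
    # torch_optimizer.RAdam; the constructor arguments line up
    optimizer = torch.optim.RAdam(model.parameters(), lr=1e-3)

    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()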
2 changes: 1 addition & 1 deletion torch_optimizer/__init__.py
@@ -79,7 +79,7 @@
     # utils
     'get',
 )
-__version__ = '0.3.0'
+__version__ = '0.3.1a0'
 
 
 _package_opts = [
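
The ``a0`` suffix makes ``0.3.1a0`` a PEP 440 pre-release, so development builds sort before, and never shadow, the eventual ``0.3.1`` release. A small illustration, assuming the third-party ``packaging`` library is installed:

.. code:: python

    from packaging.version import Version

    # pre-releases compare lower than the corresponding final release
    assert Version('0.3.1a0') < Version('0.3.1')
    assert Version('0.3.1a0').is_prerelease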
10 changes: 10 additions & 0 deletions torch_optimizer/radam.py
@@ -1,4 +1,5 @@
 import math
+import warnings
 
 import torch
 from torch.optim.optimizer import Optimizer
@@ -11,6 +12,9 @@
 class RAdam(Optimizer):
     r"""Implements RAdam optimization algorithm.
 
+    Note:
+        Deprecated, please use version provided by PyTorch_.
+
     It has been proposed in `On the Variance of the Adaptive Learning
     Rate and Beyond`__.
 
@@ -45,6 +49,12 @@ def __init__(
         eps: float = 1e-8,
         weight_decay: float = 0,
     ) -> None:
+        warnings.warn(
+            'RAdam optimizer is deprecated, since it is included '
+            'in pytorch natively.',
+            DeprecationWarning,
+            stacklevel=2,
+        )
         if lr <= 0.0:
            raise ValueError('Invalid learning rate: {}'.format(lr))
        if eps < 0.0:
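
Because the warning is raised with ``stacklevel=2``, it is attributed to the caller's line rather than to ``radam.py`` itself. A sketch of how the deprecation surfaces (the module is a placeholder; ``optim.RAdam`` keeps working as before):

.. code:: python

    import warnings

    import torch
    import torch_optimizer as optim

    model = torch.nn.Linear(10, 2)  # placeholder module

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter('always')
        optimizer = optim.RAdam(model.parameters(), lr=1e-3)

    assert any(issubclass(w.category, DeprecationWarning) for w in caught)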
