
Commit a500911
fix(gpt): drop deprecated usage of get_max_length()
fumiama committed Feb 18, 2025
1 parent bf0ec25 commit a500911
Showing 1 changed file with 4 additions and 1 deletion.
ChatTTS/model/gpt.py (5 changes: 4 additions & 1 deletion)
@@ -187,7 +187,10 @@ def _prepare_generation_inputs(
                 if cache_position is not None
                 else past_key_values.get_seq_length()
             )
-            max_cache_length = past_key_values.get_max_length()
+            try:
+                max_cache_length = past_key_values.get_max_cache_shape()
+            except:
+                max_cache_length = past_key_values.get_max_length()  # deprecated in transformers 4.48
             cache_length = (
                 past_length
                 if max_cache_length is None
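For context, transformers deprecated Cache.get_max_length() in favor of get_max_cache_shape() (the commit's own comment dates this to 4.48), so the try/except lets the model run on both older and newer releases. Below is a minimal standalone sketch of the same compatibility pattern; the DynamicCache instance is an assumption for illustration, since in the commit the pattern is applied to whatever past_key_values object generation passes in. The sketch also narrows the diff's bare except: to AttributeError, which is what a missing method actually raises.

    from transformers import DynamicCache

    # Assumed example cache; the commit applies the same pattern to the
    # past_key_values object handed to _prepare_generation_inputs.
    past_key_values = DynamicCache()

    try:
        # Newer transformers releases expose get_max_cache_shape().
        max_cache_length = past_key_values.get_max_cache_shape()
    except AttributeError:
        # Older releases only expose the since-deprecated get_max_length().
        max_cache_length = past_key_values.get_max_length()

    # A DynamicCache grows on demand, so it reports no fixed maximum and
    # the downstream cache_length logic falls back to past_length.
    assert max_cache_length is None

Trying the new API first and falling back on failure means the deprecation warning (and the eventual removal of get_max_length()) never surfaces on up-to-date installs, while old installs keep working unchanged.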
