delete default value reasoning_max_tokens (#4250)

* delete default value reasoning_max_tokens

* Adjust max_tokens and reasoning_max_tokens logic
Yuanle Liu
2025-09-26 10:42:27 +08:00
committed by GitHub
parent 213f15ef55
commit dcf633c4d9
3 changed files with 4 additions and 5 deletions


@@ -210,9 +210,6 @@ class LLMEngine:
         request.get("max_tokens"),
     ),
 )
-if request.get("reasoning_max_tokens") is None:
-    default_reasoning_max_tokens = max(int(request.get("max_tokens") * 0.8), 1)
-    request.set("reasoning_max_tokens", default_reasoning_max_tokens)
 min_tokens = request.get("min_tokens")
 if input_ids_len + min_tokens >= self.cfg.max_model_len:
     error_msg = (
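For context, the behavior the deleted lines implemented can be sketched in isolation: when the caller left `reasoning_max_tokens` unset, the engine defaulted it to 80% of `max_tokens` (floored, with a minimum of 1). The minimal `Request` wrapper below is hypothetical, standing in for the engine's actual request object:

```python
# Hypothetical stand-in for the engine's request object, exposing only
# the get/set interface the deleted code relied on.
class Request:
    def __init__(self, data):
        self._data = data

    def get(self, key):
        return self._data.get(key)

    def set(self, key, value):
        self._data[key] = value


def apply_removed_default(request):
    """Replicates the default-value logic removed by this commit."""
    if request.get("reasoning_max_tokens") is None:
        # Default to 80% of max_tokens, but never below 1.
        default_reasoning_max_tokens = max(int(request.get("max_tokens") * 0.8), 1)
        request.set("reasoning_max_tokens", default_reasoning_max_tokens)


req = Request({"max_tokens": 100})
apply_removed_default(req)
print(req.get("reasoning_max_tokens"))  # prints 80
```

After this commit, a request that omits `reasoning_max_tokens` keeps it as `None` instead of receiving this implicit default.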