fix: raise ThorClaude chat default max_tokens from 2048 to 100000
Modified file: Yi.Framework.AiHub.Domain/AiGateWay/Impl/ThorClaude/Chats/ClaudiaChatCompletionsService.cs

Notes:
- Raised the default max_tokens from 2048 to 100000 so that long replies are not truncated, improving support for large-output scenarios.
- This change can affect response length and resource consumption; confirm the backend/model supports this limit, and monitor performance and billing.
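The diff below only changes the fallback: a caller-supplied MaxTokens still wins, and the new default applies only when it is null. A minimal C# sketch of the `??` fallback behavior (the `requested` variable is illustrative, standing in for `input.MaxTokens`; it is not from the service code):

```csharp
using System;

// Caller did not set MaxTokens: the null-coalescing operator
// falls back to the new default of 100000.
int? requested = null;
int maxTokens = requested ?? 100000;
Console.WriteLine(maxTokens);           // prints 100000

// Caller-supplied value takes precedence over the default.
requested = 4096;
Console.WriteLine(requested ?? 100000); // prints 4096
```

So only requests that previously fell back to 2048 are affected by this commit.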
@@ -345,7 +345,7 @@ public sealed class ClaudiaChatCompletionsService(
     var response = await client.HttpRequestRaw(options.Endpoint.TrimEnd('/') + "/v1/messages", new
     {
         model = input.Model,
-        max_tokens = input.MaxTokens ?? 2048,
+        max_tokens = input.MaxTokens ?? 100000,
         stream = true,
         tool_choice,
         system = CreateMessage(input.Messages.Where(x => x.Role == "system").ToList(), options),