I tried `deepseek-r1:14b` via Ollama, and `gpt-oss-20b` (and others) via LM Studio; every model failed the same way.
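Every `/chat/completions` call through the proxy comes back as a 500. A minimal request that triggers it looks roughly like this (the proxy port 4000 and the dummy key are assumptions on my part, not taken from the log; the model name matches the "Received Model Group" line below):

```python
# Minimal reproduction against the LiteLLM proxy. Port 4000 (the proxy
# default) and the API key are assumptions; the model string matches the
# "Received Model Group" reported in the log.
import requests

resp = requests.post(
    "http://localhost:4000/chat/completions",
    headers={"Authorization": "Bearer sk-anything"},
    json={
        "model": "ollama/deepseek-r1:14b",
        "messages": [{"role": "user", "content": "hello"}],
    },
)
print(resp.status_code)  # 500 Internal Server Error, with the traceback below
```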
For deepseek, this is the log from the proxy:

```
21:18:47 - LiteLLM Proxy:ERROR: common_request_processing.py:644 - litellm.proxy.proxy_server._handle_llm_api_exception(): Exception occured - litellm.APIConnectionError: list index out of range
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/main.py", line 3254, in completion
response = base_llm_http_handler.completion(
model=model,
...<13 lines>...
client=client,
)
File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 330, in completion
data = provider_config.transform_request(
model=model,
...<3 lines>...
headers=headers,
)
File "/usr/lib/python3.13/site-packages/litellm/llms/ollama/completion/transformation.py", line 358, in transform_request
modified_prompt = ollama_pt(model=model, messages=messages)
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 237, in ollama_pt
tool_calls = messages[msg_i].get("tool_calls")
~~~~~~~~^^^^^^^
IndexError: list index out of range
. Received Model Group=ollama/deepseek-r1:14b
Available Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/main.py", line 3254, in completion
response = base_llm_http_handler.completion(
model=model,
...<13 lines>...
client=client,
)
File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 330, in completion
data = provider_config.transform_request(
model=model,
...<3 lines>...
headers=headers,
)
File "/usr/lib/python3.13/site-packages/litellm/llms/ollama/completion/transformation.py", line 358, in transform_request
modified_prompt = ollama_pt(model=model, messages=messages)
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 237, in ollama_pt
tool_calls = messages[msg_i].get("tool_calls")
~~~~~~~~^^^^^^^
IndexError: list index out of range
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/proxy/proxy_server.py", line 4092, in chat_completion
result = await base_llm_response_processor.base_process_llm_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<16 lines>...
)
^
File "/usr/lib/python3.13/site-packages/litellm/proxy/common_request_processing.py", line 438, in base_process_llm_request
responses = await llm_responses
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1076, in acompletion
raise e
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1052, in acompletion
response = await self.async_function_with_fallbacks(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3895, in async_function_with_fallbacks
return await self.async_function_with_fallbacks_common_utils(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<8 lines>...
)
^
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3853, in async_function_with_fallbacks_common_utils
raise original_exception
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3887, in async_function_with_fallbacks
response = await self.async_function_with_retries(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 4092, in async_function_with_retries
raise original_exception
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3983, in async_function_with_retries
response = await self.make_call(original_function, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 4101, in make_call
response = await response
^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1357, in _acompletion
raise e
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1309, in _acompletion
response = await _response
^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1598, in wrapper_async
raise e
File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1449, in wrapper_async
result = await original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/main.py", line 565, in acompletion
raise exception_type(
...<5 lines>...
)
File "/usr/lib/python3.13/site-packages/litellm/main.py", line 538, in acompletion
init_response = await loop.run_in_executor(None, func_with_context)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/concurrent/futures/thread.py", line 59, in run
result = self.fn(*self.args, **self.kwargs)
File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1072, in wrapper
result = original_function(*args, **kwargs)
File "/usr/lib/python3.13/site-packages/litellm/main.py", line 3582, in completion
raise exception_type(
~~~~~~~~~~~~~~^
model=model,
^^^^^^^^^^^^
...<3 lines>...
extra_kwargs=kwargs,
^^^^^^^^^^^^^^^^^^^^
)
^
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2301, in exception_type
raise e
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2277, in exception_type
raise APIConnectionError(
...<8 lines>...
)
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: list index out of range
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/main.py", line 3254, in completion
response = base_llm_http_handler.completion(
model=model,
...<13 lines>...
client=client,
)
File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 330, in completion
data = provider_config.transform_request(
model=model,
...<3 lines>...
headers=headers,
)
File "/usr/lib/python3.13/site-packages/litellm/llms/ollama/completion/transformation.py", line 358, in transform_request
modified_prompt = ollama_pt(model=model, messages=messages)
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 237, in ollama_pt
tool_calls = messages[msg_i].get("tool_calls")
~~~~~~~~^^^^^^^
IndexError: list index out of range
. Received Model Group=ollama/deepseek-r1:14b
Available Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2
INFO: 172.19.0.3:38404 - "POST /chat/completions HTTP/1.1" 500 Internal Server Error
21:18:51 - LiteLLM Proxy:ERROR: common_request_processing.py:644 - litellm.proxy.proxy_server._handle_llm_api_exception(): Exception occured - litellm.APIConnectionError: list index out of range
[... identical traceback as above, repeated for the second request ...]
INFO: 172.19.0.3:56464 - "POST /chat/completions HTTP/1.1" 500 Internal Server Error
```
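Despite the `APIConnectionError` wrapper, this is not a connection problem: the traceback dies inside LiteLLM's Ollama prompt templating, where `ollama_pt` indexes `messages[msg_i]` after a merge loop has already advanced `msg_i` past the end of the list. A paraphrased sketch of that pattern, with a bounds-checked version (this illustrates the shape the traceback implies; it is not the actual litellm source):

```python
# Paraphrased sketch of the failing pattern in ollama_pt
# (litellm_core_utils/prompt_templates/factory.py:237) -- inferred
# from the traceback, not copied from the real source.
messages = [{"role": "user", "content": "hello"}]

msg_i = 0
# merge consecutive user-side messages into one prompt chunk
while msg_i < len(messages) and messages[msg_i]["role"] == "user":
    msg_i += 1

# here msg_i == len(messages), so an unguarded lookup raises
# "IndexError: list index out of range", exactly as in the log:
#   tool_calls = messages[msg_i].get("tool_calls")

# a defensive version checks bounds before indexing:
tool_calls = (
    messages[msg_i].get("tool_calls") if msg_i < len(messages) else None
)
print(tool_calls)  # -> None
```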
This is my YAML config:

```yaml
litellm_settings:
  drop_params: true
  prompt_template: null

model_list:
  - model_name: gpt-oss-20b
    litellm_params:
      model: openai/gpt-oss-20b
      api_base: http://host.docker.internal:1234
      api_key: ""
  - model_name: deepseek-r1:14b
    litellm_params:
      model: ollama/deepseek-r1:14b
      api_base: http://host.docker.internal:11434
      api_key: "ollama"
```
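A possible workaround (untested on my side) would be to skip the Ollama prompt-templating path entirely by routing through Ollama's OpenAI-compatible endpoint at `/v1`; the LM Studio entry likely needs a `/v1` suffix as well, since its OpenAI-compatible server lives there. A sketch of what that config might look like:

```yaml
# Hypothetical workaround config: point both backends at their
# OpenAI-compatible /v1 endpoints so LiteLLM's openai provider is
# used instead of the ollama_pt prompt transformation.
model_list:
  - model_name: deepseek-r1:14b
    litellm_params:
      model: openai/deepseek-r1:14b
      api_base: http://host.docker.internal:11434/v1
      api_key: "ollama"   # Ollama ignores the key; any non-empty value works
  - model_name: gpt-oss-20b
    litellm_params:
      model: openai/gpt-oss-20b
      api_base: http://host.docker.internal:1234/v1
      api_key: ""
```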