
[Bug] ChatQnA fails on Xeon with v1.3 and v1.4 #2237

@mkarvir


Priority

Undecided

OS type

Ubuntu

Hardware type

Xeon-SPR

Installation method

  • Pull docker images from hub.docker.com
  • Build docker images from source
  • Other
  • N/A

Deploy method

  • Docker
  • Docker Compose
  • Kubernetes Helm Charts
  • Kubernetes GMC
  • Other
  • N/A

Running nodes

Single Node

What's the version?

Fails with v1.3 and v1.4; works with v1.2.

Description

Deployed on IBM Cloud, following the instructions at https://opea-project.github.io/latest/getting-started/README.html
v1.2 deployed OK.
With v1.3, vllm-service kept stalling in the waiting state; docker logs vllm-service showed nothing significant.
Both tei-embedding-server and tei-reranking-server failed with "Error: Could not download model artifacts ... relative URL without a base" (full output in the raw log below).

v1.4
✔ Network xeon_default Created 0.0s
✔ Container tei-reranking-server Started 0.6s
✔ Container redis-vector-db Healthy 5.5s
✔ Container tei-embedding-server Started 0.5s
✘ Container vllm-service Error 15.0s
✔ Container retriever-redis-server Started 0.6s
✘ Container dataprep-redis-server Error 37.1s
✔ Container chatqna-xeon-backend-server Created 0.0s
✔ Container chatqna-xeon-ui-server Created 0.0s
✔ Container chatqna-xeon-nginx-server Created 0.0s
dependency failed to start: container vllm-service exited (1)

docker logs vllm-service indicates a failure downloading the model from Hugging Face.
Note that $HUGGINGFACEHUB_API_TOKEN was set correctly, confirmed with echo $HUGGINGFACEHUB_API_TOKEN.
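A possible factor (an assumption, not verified): the dataprep-redis-server traceback in the raw log complains about `HF_TOKEN`, so the v1.3/v1.4 compose files may read `HF_TOKEN` rather than `HUGGINGFACEHUB_API_TOKEN`. A minimal pre-deployment sketch that exports both names:

```shell
# Mirror the legacy variable into HF_TOKEN before running `docker compose up`.
# Assumption: newer compose files forward HF_TOKEN into the containers, while
# older ones forward HUGGINGFACEHUB_API_TOKEN; exporting both is harmless.
export HF_TOKEN="${HF_TOKEN:-${HUGGINGFACEHUB_API_TOKEN:-}}"
if [ -z "$HF_TOKEN" ]; then
  echo "warning: no Hugging Face token is set in this shell" >&2
fi
```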

Reproduce steps

Follow the deployment instructions for IBM Cloud: https://opea-project.github.io/latest/getting-started/README.html
export RELEASE_VERSION=1.3
Clean up
docker stop $(docker ps -q)
docker rm $(docker ps -a -q)
docker system prune --all --volumes --force
rm -rf ~/.cache/*
sudo rm -rf /var/cache/apt/archives/*

Repeat for release 1.4
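One extra check worth making before re-running the steps (an addition, not part of the original report): meta-llama/Meta-Llama-3-8B-Instruct is a gated repo, so the token must belong to an account that was granted access on the model page, otherwise vLLM fails with the 401 shown in the raw log. A hedged way to verify from the deployment host, assuming huggingface_hub's CLI is installed:

```shell
# `whoami` fails fast on a missing or invalid token; the download fails with
# 401/403 if the account has not been granted access to the gated repo.
huggingface-cli whoami
huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct config.json
```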

Raw log

v1.3: vllm-service kept stalling in the waiting state; docker logs vllm-service showed nothing significant.
$ docker logs tei-embedding-server
2025-09-03T20:13:02.927187Z  INFO text_embeddings_router: router/src/main.rs:175: Args { model_id: "BAA*/***-****-**-v1.5", revision: None, tokenization_workers: None, dtype: None, pooling: None, max_concurrent_requests: 512, max_batch_tokens: 16384, max_batch_requests: None, max_client_batch_size: 32, auto_truncate: true, default_prompt_name: None, default_prompt: None, hf_api_token: None, hostname: "*****", port: 80, uds_path: "/tmp/text-embeddings-inference-server", huggingface_hub_cache: Some("/data"), payload_limit: 2000000, api_key: None, json_output: false, otlp_endpoint: None, otlp_service_name: "text-embeddings-inference.server", cors_allow_origin: None }
2025-09-03T20:13:02.927297Z  INFO hf_hub: /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/hf-hub-0.3.2/src/lib.rs:55: Token file not found "/root/.cache/huggingface/token"    
2025-09-03T20:13:02.969056Z  INFO download_pool_config: text_embeddings_core::download: core/src/download.rs:38: Downloading `1_Pooling/config.json`
2025-09-03T20:13:03.334494Z  INFO download_new_st_config: text_embeddings_core::download: core/src/download.rs:62: Downloading `config_sentence_transformers.json`
2025-09-03T20:13:03.377005Z  INFO download_artifacts: text_embeddings_core::download: core/src/download.rs:21: Starting download
2025-09-03T20:13:03.377013Z  INFO download_artifacts: text_embeddings_core::download: core/src/download.rs:23: Downloading `config.json`
Error: Could not download model artifacts

Caused by:
    0: request error: builder error: relative URL without a base
    1: builder error: relative URL without a base
    2: relative URL without a base

docker logs tei-reranking-server
2025-09-03T20:13:02.872455Z  INFO text_embeddings_router: router/src/main.rs:175: Args { model_id: "BAA*/***-********-*ase", revision: None, tokenization_workers: None, dtype: None, pooling: None, max_concurrent_requests: 512, max_batch_tokens: 16384, max_batch_requests: None, max_client_batch_size: 32, auto_truncate: true, default_prompt_name: None, default_prompt: None, hf_api_token: None, hostname: "******", port: 80, uds_path: "/tmp/text-embeddings-inference-server", huggingface_hub_cache: Some("/data"), payload_limit: 2000000, api_key: None, json_output: false, otlp_endpoint: None, otlp_service_name: "text-embeddings-inference.server", cors_allow_origin: None }
2025-09-03T20:13:02.872552Z  INFO hf_hub: /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/hf-hub-0.3.2/src/lib.rs:55: Token file not found "/root/.cache/huggingface/token"    
2025-09-03T20:13:02.915969Z  INFO download_pool_config: text_embeddings_core::download: core/src/download.rs:38: Downloading `1_Pooling/config.json`
2025-09-03T20:13:03.295591Z  INFO download_new_st_config: text_embeddings_core::download: core/src/download.rs:62: Downloading `config_sentence_transformers.json`
2025-09-03T20:13:03.338156Z  INFO download_artifacts: text_embeddings_core::download: core/src/download.rs:21: Starting download
2025-09-03T20:13:03.338166Z  INFO download_artifacts: text_embeddings_core::download: core/src/download.rs:23: Downloading `config.json`
Error: Could not download model artifacts

Caused by:
    0: request error: builder error: relative URL without a base
    1: builder error: relative URL without a base
    2: relative URL without a base

v1.4

Note: $HUGGINGFACEHUB_API_TOKEN was set correctly (confirmed with echo $HUGGINGFACEHUB_API_TOKEN).

docker logs vllm-service
INFO 09-03 21:20:45 [__init__.py:235] Automatically detected platform cpu.
WARNING 09-03 21:20:47 [_logger.py:72] Torch Profiler is enabled in the API server. This should ONLY be used for local development!
INFO 09-03 21:20:47 [api_server.py:1755] vLLM API server version 0.10.0
INFO 09-03 21:20:47 [cli_args.py:261] non-default args: {'host': '0.0.0.0', 'port': 80, 'model': 'meta-llama/Meta-Llama-3-8B-Instruct'}
Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/utils/_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "/opt/venv/lib/python3.12/site-packages/requests/models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/transformers/utils/hub.py", line 479, in cached_files
    hf_hub_download(
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1010, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1117, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1658, in _raise_on_head_call_error
    raise head_call_error
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1546, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
               ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 1463, in get_hf_file_metadata
    r = _request_wrapper(
        ^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
               ^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "/opt/venv/lib/python3.12/site-packages/huggingface_hub/utils/_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-68b8b130-3a3cb54452fc7d643ed8cdbb;c23d22d2-0c28-4031-9dbd-0fd66f3b6685)

Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct/resolve/main/config.json.
Access to model meta-llama/Meta-Llama-3-8B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/opt/venv/lib/python3.12/site-packages/vllm/entrypoints/openai/api_server.py", line 1856, in <module>
    uvloop.run(run_server(args))
  File "/opt/venv/lib/python3.12/site-packages/uvloop/__init__.py", line 109, in run
    return __asyncio.run(
           ^^^^^^^^^^^^^^
  File "/opt/uv/python/cpython-3.12.11-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/opt/uv/python/cpython-3.12.11-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "uvloop/loop.pyx", line 1518, in uvloop.loop.Loop.run_until_complete
  File "/opt/venv/lib/python3.12/site-packages/uvloop/__init__.py", line 61, in wrapper
    return await main
           ^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/vllm/entrypoints/openai/api_server.py", line 1791, in run_server
    await run_server_worker(listen_address, sock, args, **uvicorn_kwargs)
  File "/opt/venv/lib/python3.12/site-packages/vllm/entrypoints/openai/api_server.py", line 1811, in run_server_worker
    async with build_async_engine_client(args, client_config) as engine_client:
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/uv/python/cpython-3.12.11-linux-x86_64-gnu/lib/python3.12/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/vllm/entrypoints/openai/api_server.py", line 158, in build_async_engine_client
    async with build_async_engine_client_from_engine_args(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/uv/python/cpython-3.12.11-linux-x86_64-gnu/lib/python3.12/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/vllm/entrypoints/openai/api_server.py", line 180, in build_async_engine_client_from_engine_args
    vllm_config = engine_args.create_engine_config(usage_context=usage_context)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/vllm/engine/arg_utils.py", line 1004, in create_engine_config
    model_config = self.create_model_config()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/vllm/engine/arg_utils.py", line 872, in create_model_config
    return ModelConfig(
           ^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/pydantic/_internal/_dataclasses.py", line 123, in __init__
    s.__pydantic_validator__.validate_python(ArgsKwargs(args, kwargs), self_instance=s)
  File "/opt/venv/lib/python3.12/site-packages/vllm/config.py", line 544, in __post_init__
    hf_config = get_config(self.hf_config_path or self.model,
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/vllm/transformers_utils/config.py", line 355, in get_config
    config_dict, _ = PretrainedConfig.get_config_dict(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/transformers/configuration_utils.py", line 649, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/transformers/configuration_utils.py", line 708, in _get_config_dict
    resolved_config_file = cached_file(
                           ^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/transformers/utils/hub.py", line 321, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/transformers/utils/hub.py", line 543, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct.
401 Client Error. (Request ID: Root=1-68b8b130-3a3cb54452fc7d643ed8cdbb;c23d22d2-0c28-4031-9dbd-0fd66f3b6685)

Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct/resolve/main/config.json.
Access to model meta-llama/Meta-Llama-3-8B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

docker logs dataprep-redis-server 
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing LLMChain from langchain root module is no longer supported. Please use langchain.chains.LLMChain instead.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing PromptTemplate from langchain root module is no longer supported. Please use langchain_core.prompts.PromptTemplate instead.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/comps/dataprep/src/opea_dataprep_microservice.py", line 46, in <module>
    loader = OpeaDataprepLoader(
             ^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/opea_dataprep_loader.py", line 15, in __init__
    super().__init__(component_name=component_name, **kwargs)
  File "/home/user/comps/cores/common/component.py", line 152, in __init__
    self.component = component_class(**kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 315, in __init__
    self.embedder = asyncio.run(self._initialize_embedder())
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 30, in run
    return loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 98, in run_until_complete
    return f.result()
           ^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 337, in _initialize_embedder
    raise HTTPException(
fastapi.exceptions.HTTPException: 400: You MUST offer the `HF_TOKEN` when using `TEI_EMBEDDING_ENDPOINT`.
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing LLMChain from langchain root module is no longer supported. Please use langchain.chains.LLMChain instead.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing PromptTemplate from langchain root module is no longer supported. Please use langchain_core.prompts.PromptTemplate instead.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/comps/dataprep/src/opea_dataprep_microservice.py", line 46, in <module>
    loader = OpeaDataprepLoader(
             ^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/opea_dataprep_loader.py", line 15, in __init__
    super().__init__(component_name=component_name, **kwargs)
  File "/home/user/comps/cores/common/component.py", line 152, in __init__
    self.component = component_class(**kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 315, in __init__
    self.embedder = asyncio.run(self._initialize_embedder())
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 30, in run
    return loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 98, in run_until_complete
    return f.result()
           ^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 337, in _initialize_embedder
    raise HTTPException(
fastapi.exceptions.HTTPException: 400: You MUST offer the `HF_TOKEN` when using `TEI_EMBEDDING_ENDPOINT`.
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing LLMChain from langchain root module is no longer supported. Please use langchain.chains.LLMChain instead.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing PromptTemplate from langchain root module is no longer supported. Please use langchain_core.prompts.PromptTemplate instead.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/comps/dataprep/src/opea_dataprep_microservice.py", line 46, in <module>
    loader = OpeaDataprepLoader(
             ^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/opea_dataprep_loader.py", line 15, in __init__
    super().__init__(component_name=component_name, **kwargs)
  File "/home/user/comps/cores/common/component.py", line 152, in __init__
    self.component = component_class(**kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 315, in __init__
    self.embedder = asyncio.run(self._initialize_embedder())
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 30, in run
    return loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 98, in run_until_complete
    return f.result()
           ^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 337, in _initialize_embedder
    raise HTTPException(
fastapi.exceptions.HTTPException: 400: You MUST offer the `HF_TOKEN` when using `TEI_EMBEDDING_ENDPOINT`.
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing LLMChain from langchain root module is no longer supported. Please use langchain.chains.LLMChain instead.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing PromptTemplate from langchain root module is no longer supported. Please use langchain_core.prompts.PromptTemplate instead.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/comps/dataprep/src/opea_dataprep_microservice.py", line 46, in <module>
    loader = OpeaDataprepLoader(
             ^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/opea_dataprep_loader.py", line 15, in __init__
    super().__init__(component_name=component_name, **kwargs)
  File "/home/user/comps/cores/common/component.py", line 152, in __init__
    self.component = component_class(**kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 315, in __init__
    self.embedder = asyncio.run(self._initialize_embedder())
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 30, in run
    return loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 98, in run_until_complete
    return f.result()
           ^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 337, in _initialize_embedder
    raise HTTPException(
fastapi.exceptions.HTTPException: 400: You MUST offer the `HF_TOKEN` when using `TEI_EMBEDDING_ENDPOINT`.
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing LLMChain from langchain root module is no longer supported. Please use langchain.chains.LLMChain instead.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing PromptTemplate from langchain root module is no longer supported. Please use langchain_core.prompts.PromptTemplate instead.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/comps/dataprep/src/opea_dataprep_microservice.py", line 46, in <module>
    loader = OpeaDataprepLoader(
             ^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/opea_dataprep_loader.py", line 15, in __init__
    super().__init__(component_name=component_name, **kwargs)
  File "/home/user/comps/cores/common/component.py", line 152, in __init__
    self.component = component_class(**kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 315, in __init__
    self.embedder = asyncio.run(self._initialize_embedder())
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 30, in run
    return loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 98, in run_until_complete
    return f.result()
           ^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 337, in _initialize_embedder
    raise HTTPException(
fastapi.exceptions.HTTPException: 400: You MUST offer the `HF_TOKEN` when using `TEI_EMBEDDING_ENDPOINT`.
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing LLMChain from langchain root module is no longer supported. Please use langchain.chains.LLMChain instead.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing PromptTemplate from langchain root module is no longer supported. Please use langchain_core.prompts.PromptTemplate instead.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/comps/dataprep/src/opea_dataprep_microservice.py", line 46, in <module>
    loader = OpeaDataprepLoader(
             ^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/opea_dataprep_loader.py", line 15, in __init__
    super().__init__(component_name=component_name, **kwargs)
  File "/home/user/comps/cores/common/component.py", line 152, in __init__
    self.component = component_class(**kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 315, in __init__
    self.embedder = asyncio.run(self._initialize_embedder())
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 30, in run
    return loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 98, in run_until_complete
    return f.result()
           ^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 337, in _initialize_embedder
    raise HTTPException(
fastapi.exceptions.HTTPException: 400: You MUST offer the `HF_TOKEN` when using `TEI_EMBEDDING_ENDPOINT`.
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing LLMChain from langchain root module is no longer supported. Please use langchain.chains.LLMChain instead.
  warnings.warn(
/usr/local/lib/python3.11/site-packages/langchain/__init__.py:30: UserWarning: Importing PromptTemplate from langchain root module is no longer supported. Please use langchain_core.prompts.PromptTemplate instead.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/comps/dataprep/src/opea_dataprep_microservice.py", line 46, in <module>
    loader = OpeaDataprepLoader(
             ^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/opea_dataprep_loader.py", line 15, in __init__
    super().__init__(component_name=component_name, **kwargs)
  File "/home/user/comps/cores/common/component.py", line 152, in __init__
    self.component = component_class(**kwargs)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 315, in __init__
    self.embedder = asyncio.run(self._initialize_embedder())
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 30, in run
    return loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/nest_asyncio.py", line 98, in run_until_complete
    return f.result()
           ^^^^^^^^^^
  File "/usr/local/lib/python3.11/asyncio/futures.py", line 203, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 277, in __step
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/home/user/comps/dataprep/src/integrations/redis.py", line 337, in _initialize_embedder
    raise HTTPException(
fastapi.exceptions.HTTPException: 400: You MUST offer the `HF_TOKEN` when using `TEI_EMBEDDING_ENDPOINT`.
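Based on the error message above, a possible workaround (not verified against v1.3/v1.4): the dataprep service refuses to initialize its embedder when `TEI_EMBEDDING_ENDPOINT` is set but no Hugging Face token is provided, so exporting a token before bringing the stack up may let it start. The variable names below follow the error text and common OPEA compose conventions; check your compose file for the exact names it reads.

```shell
# Hypothetical fix: export a valid Hugging Face token so the dataprep
# container sees it alongside TEI_EMBEDDING_ENDPOINT.
export HF_TOKEN="hf_xxxxxxxxxxxxxxxx"          # replace with a real token
export HUGGINGFACEHUB_API_TOKEN="$HF_TOKEN"    # some compose files use this name

# then redeploy the stack, e.g.:
# docker compose -f compose.yaml up -d
```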

Attachments

No response

Labels

bug