
[🐛 Bug]: Create Incidents With AI Function Fails with Connection Error (Reopened After Accidental Closure) #5429

@kc945752357

Description


Related issue: #5428
This issue was reopened after being accidentally closed by a misclick; the original problem persists: the "Create Incidents With AI" function consistently fails with a connection error when calling the /chat/completions interface, while all other AI features in the system work normally.

Thanks for the guided troubleshooting! Just to confirm: we've already verified that the key environment variables are properly configured (screenshot attached for reference), but the issue persists.
The OPENAI_BASE_URL includes the /v1 suffix, and OPENAI_API_KEY, LITELLM_SALT_KEY, and LITELLM_VIRTUAL_KEY are consistent with other working AI modules across frontend and backend containers.
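For reference, this is roughly how we compared the variables across containers ("frontend" and "backend" are placeholder container names; substitute whatever your deployment uses):

```shell
# AI-related environment variables to compare across containers.
VARS='OPENAI_BASE_URL|OPENAI_API_KEY|LITELLM_SALT_KEY|LITELLM_VIRTUAL_KEY'

# Dump the matching variables from each container; identical output on
# both sides means the values are consistent. The loop is a no-op if a
# container name does not exist in this environment.
for c in frontend backend; do
  docker exec "$c" printenv 2>/dev/null | grep -E "^($VARS)=" | sort
done
```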
We’ll now focus on comparing the payload structure/size between the failing Create Incidents With AI function and working features, as well as testing the exact payload via curl. We’ll also check LiteLLM logs for payload-specific errors that might not show up in basic connectivity tests.
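For reference, this is the shape of the curl check we plan to run against the endpoint (the model name here is a placeholder for whichever model the LiteLLM proxy actually serves; OPENAI_BASE_URL is expected to already include the /v1 suffix):

```shell
# Write a minimal /chat/completions request body to a file so the exact
# payload can be inspected and reused.
cat > /tmp/payload.json <<'EOF'
{
  "model": "gpt-4o",
  "messages": [{"role": "user", "content": "ping"}]
}
EOF

# Only fire the request when the endpoint is actually configured.
if [ -n "${OPENAI_BASE_URL:-}" ] && [ -n "${OPENAI_API_KEY:-}" ]; then
  curl -sS -X POST "${OPENAI_BASE_URL%/}/chat/completions" \
    -H "Authorization: Bearer ${OPENAI_API_KEY}" \
    -H "Content-Type: application/json" \
    -d @/tmp/payload.json
fi
```

If this minimal payload succeeds, we can grow it toward the size and structure the Create Incidents With AI feature sends, to find where the connection error starts.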
Let us know if there’s any additional context we should provide for further debugging!
(Note: a screenshot of the configuration file is attached, showing the correctly set OPENAI_BASE_URL, OPENAI_API_KEY, and related variables.)

