
Sequence batcher coredump when client does not send end flag #8391

@wuwenjunwwj

Description

When I deploy the FunASR streaming model (https://github.com/modelscope/FunASR/tree/main/runtime/triton_gpu/model_repo_paraformer_large_online) with Triton Server and the client does not send the sequence end flag, the Triton server coredumps.
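To make "does not send the end flag" concrete, here is a minimal, hypothetical client sketch using the Python gRPC streaming client. The model name `streaming_asr`, the input tensor name `chunk`, and the shapes are placeholders, not the real FunASR paraformer tensors; only the sequence-flag pattern matters:

```python
import numpy as np
import tritonclient.grpc as grpcclient

# Placeholder names -- the real FunASR model has its own model name and
# input tensors; this only illustrates the missing end flag.
MODEL_NAME = "streaming_asr"
INPUT_NAME = "chunk"

client = grpcclient.InferenceServerClient(url="localhost:8001")
client.start_stream(callback=lambda result, error: None)

sequence_id = 12345
for i in range(3):
    chunk = np.zeros((1, 960), dtype=np.float32)  # dummy audio chunk
    inp = grpcclient.InferInput(INPUT_NAME, list(chunk.shape), "FP32")
    inp.set_data_from_numpy(chunk)
    client.async_stream_infer(
        model_name=MODEL_NAME,
        inputs=[inp],
        sequence_id=sequence_id,
        sequence_start=(i == 0),
        sequence_end=False,  # the end flag is never sent for this sequence
    )

# The client disconnects here without ever issuing a request with
# sequence_end=True, so the sequence stays open on the server side.
client.stop_stream()
```

With this pattern the server keeps a batch slot allocated for sequence 12345 until it has been idle past the model's max_sequence_idle_microseconds, at which point the sequence batcher's reaper thread cancels it; that is the code path in the backtrace below.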


The coredump backtrace is:
#0 0x00007f41048bf4d1 in triton::core::InferenceRequest::LogRequest[abi:cxx11]() const () from /opt/tritonserver/lib/libtritonserver.so
[Current thread is 1 (Thread 0x7f3a17fff000 (LWP 30710))]
(gdb) bt
#0 0x00007f41048bf4d1 in triton::core::InferenceRequest::LogRequest[abi:cxx11]() const () from /opt/tritonserver/lib/libtritonserver.so
#1 0x00007f41049f4ecf in triton::core::OldestSequenceBatch::CompleteAndNext(unsigned int) () from /opt/tritonserver/lib/libtritonserver.so
#2 0x00007f41049f370b in triton::core::SequenceBatchScheduler::ReaperThread(int) () from /opt/tritonserver/lib/libtritonserver.so
#3 0x00007f4104403de4 in ?? () from /usr/lib/x86_64-linux-gnu/libstdc++.so.6
#4 0x00007f410c8aa609 in start_thread (arg=) at pthread_create.c:477
#5 0x00007f41040f0353 in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:95
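The crashing frames (SequenceBatchScheduler::ReaperThread → OldestSequenceBatch::CompleteAndNext) are the path that times out idle sequences, which matches this scenario: because the end flag never arrives, the sequence stays open until it exceeds max_sequence_idle_microseconds and is cancelled by the reaper. For reference, a minimal sketch of the kind of sequence_batching block such a model uses (values illustrative only, not copied from the FunASR config):

```
sequence_batching {
  # The reaper thread cancels any sequence that stays idle longer than this.
  max_sequence_idle_microseconds: 10000000
  oldest {
    max_candidate_sequences: 512
  }
}
```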

Triton Information
What version of Triton are you using?
r23.04

Are you using the Triton container or did you build it yourself?

To Reproduce

Expected behavior
A clear and concise description of what you expected to happen.
