So I want to send an audio stream to my FastAPI server, which processes it and sends it back to the client. It's pretty straightforward to build a streaming endpoint in FastAPI with:
```python
from fastapi import FastAPI, WebSocket

app = FastAPI()

@app.websocket("/audio-stream/")
async def audio_stream_endpoint(websocket: WebSocket):
    await websocket.accept()
    try:
        while True:
            # Receive raw audio data from the client
            audio_data = await websocket.receive_bytes()
            # Process the received chunk
            pa = streaming_service.process_stream(audio_data)
            # Send processed audio data back to the client
            await websocket.send_bytes(pa)
    except Exception as e:
        print("Connection closed:", e)
    finally:
        await websocket.close()
```
However, I wondered whether it makes sense to use sounddevice streams in this scenario, to take advantage of the optimizations they surely have over my naive implementation. The question is how to implement that: use sounddevice on the client side and send the stream through FastAPI as bytes, then take the audio_data from my endpoint and convert it back into a sounddevice stream for processing and so on.
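One observation: on the server side you may not need a sounddevice stream at all, since sounddevice is primarily for talking to local audio hardware. The raw bytes arriving at the endpoint can be reinterpreted as a NumPy array for processing and converted back before sending. A minimal sketch, assuming 16-bit signed mono PCM; the `process_chunk` function and its gain parameter are hypothetical stand-ins for whatever `streaming_service.process_stream` does:

```python
import numpy as np

def process_chunk(audio_bytes: bytes, gain: float = 0.5) -> bytes:
    # Interpret the raw websocket payload as 16-bit signed PCM samples
    samples = np.frombuffer(audio_bytes, dtype=np.int16)
    # Hypothetical processing step: apply a simple gain in float, then
    # cast back to int16
    processed = (samples.astype(np.float32) * gain).astype(np.int16)
    # Return raw bytes suitable for websocket.send_bytes()
    return processed.tobytes()
```

On the client, sounddevice's `RawInputStream` callback already hands you a bytes-like buffer, so those chunks could be sent over the websocket as-is, provided both sides agree on the sample rate, channel count, and dtype.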