Discussed in #1150
Originally posted by nilooy June 25, 2023
Is it possible to use a Quirrel queue with a Vercel Edge Function? I was looking specifically at running something like this as a background job via Quirrel:
https://github.com/inngest/vercel-ai-sdk/blob/main/examples/next-openai/app/api/chat/route.ts
I tried the following approach:
```ts
import { Queue as TestQueue } from "quirrel/next";
import { Configuration, OpenAIApi } from "openai-edge";
import { OpenAIStream, StreamingTextResponse } from "ai";

export const runtime = "edge";

const config = new Configuration({
  apiKey: process.env.OPENAI_API_KEY,
});
const openai = new OpenAIApi(config);

// @ts-ignore
export default TestQueue("api/test", async (params) => {
  const response = await openai.createChatCompletion({
    model: "gpt-3.5-turbo",
    stream: true,
    messages: [{ role: "user", content: "explain the next js" }],
  });
  const stream = OpenAIStream(response);
  // Respond with the stream
  return new StreamingTextResponse(stream);
});
```
and ran it from another route:

```ts
await TestQueue.enqueue({ test: 123 });
```
This results in the following error at runtime:
```
👟 Executing job
  queue: /api/test
  id: 7f0226c0-4824-4671-9efa-e926484e95ae
  body: {"test":123}
error - node_modules/quirrel/dist/esm/src/client/enhanced-json.js (13:0) @ Module.parse
error - Unexpected token o in JSON at position 1
null
```
It worked perfectly without `export const runtime = "edge"`.