quickstart boilerplate for langchainjs/langgraphjs
examples (mostly with langgraph)
- llm with groq api
- ✨ llm with local llm, tested with LM Studio
- tool call: tavily api
- structured output
 
RAG
- simple
- generate_query_or_respond
- memory: chat history
- docs grading
 
 
npm i
# Option 1: use a local llm, configure the `baseURL` in code, then run
npx tsx ./langchain/chain-groq1-chat-local-mini.ts
# Option 2: use the groq api, configure the `GROQ_API_KEY` first
cp .env.example .env
npx tsx ./langchain/chain-groq1-chat-starter.ts

- `graph.stream` does not work with a local llm
examples in python: https://github.com/uptonking/langchain-langgraph-play
why does rag return only one relevant doc?
- changing the embedding model may help
 
 
MIT