LangChain TypeScript Simple Example
This example shows a basic "hello world" of using LangChain with Ollama from Node.js and TypeScript.
Running the Example
1. Install the prerequisites.
2. Make sure the mistral model is available.
3. Run the example.

Example commands for these steps are shown below.
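This is a sketch assuming the example directory has its own package.json and that ts-node is used to run main.ts; the exact run command may differ depending on how the project is set up:

npm install
ollama pull mistral
npx ts-node main.ts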
Source Code
main.ts
import { Ollama } from 'langchain/llms/ollama';
import * as readline from "readline";

async function main() {
  // Create a LangChain LLM wrapper that talks to a local Ollama server.
  const ollama = new Ollama({
    model: 'mistral'
    // other parameters can be found at https://js.langchain.com/docs/api/llms_ollama/classes/Ollama
  });

  // Read a single question from the terminal.
  const rl = readline.createInterface({
    input: process.stdin,
    output: process.stdout,
  });

  rl.question("What is your question: \n", async (user_input) => {
    // Stream the model's answer, printing each chunk as it arrives.
    const stream = await ollama.stream(user_input);
    for await (const chunk of stream) {
      process.stdout.write(chunk);
    }
    rl.close();
  });
}

main();
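When run, the program prompts "What is your question:", sends the input to the mistral model through Ollama, streams the reply to stdout chunk by chunk, and exits once the stream completes.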