
LangChain Web Summarization

This example summarizes the website at https://ollama.com/blog/run-llama2-uncensored-locally.

Running the Example

1. Ensure you have the llama2 model installed:

ollama pull llama2

2. Install the Python requirements:

pip install -r requirements.txt

3. Run the example:

python main.py
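
Note: the local Ollama server must be running for the example to connect to it (the desktop app or ollama serve will start it). You can confirm the model was pulled successfully with:

ollama list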

Source Code

main.py

from langchain.llms import Ollama
from langchain.document_loaders import WebBaseLoader
from langchain.chains.summarize import load_summarize_chain

# Load the web page into LangChain documents.
loader = WebBaseLoader("https://ollama.com/blog/run-llama2-uncensored-locally")
docs = loader.load()

# Use the local llama2 model served by Ollama.
llm = Ollama(model="llama2")
# "stuff" places the entire page into a single summarization prompt.
chain = load_summarize_chain(llm, chain_type="stuff")

result = chain.run(docs)
print(result)
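
Because the "stuff" chain sends the whole page to the model in one prompt, a very long page can exceed the model's context window. A minimal variant, sketched with the same legacy LangChain import paths as above, splits the page into chunks and summarizes with a map_reduce chain instead; the chunk sizes here are illustrative, not taken from the example:

from langchain.llms import Ollama
from langchain.document_loaders import WebBaseLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.chains.summarize import load_summarize_chain

loader = WebBaseLoader("https://ollama.com/blog/run-llama2-uncensored-locally")
docs = loader.load()

# Split the page into overlapping chunks so each fits in the context window.
splitter = RecursiveCharacterTextSplitter(chunk_size=2000, chunk_overlap=200)
split_docs = splitter.split_documents(docs)

llm = Ollama(model="llama2")
# map_reduce summarizes each chunk, then summarizes the partial summaries.
chain = load_summarize_chain(llm, chain_type="map_reduce")
print(chain.run(split_docs))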

Repository

langchain-python-rag-websummary
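
If you prefer to run it from a local checkout, this example lives inside the Ollama repository's examples directory; the clone URL and path below are an assumption about the repository layout rather than something stated on this page:

# assumed location of the example (verify against the repository)
git clone https://github.com/ollama/ollama
cd ollama/examples/langchain-python-rag-websummary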