[Personal Development] llama2 Deployment in Practice (IV) - llama Service Invocation Patterns - 3. Calling via langchain
最编程
2024-03-14 11:04:47
...
from langchain.llms.llamacpp import LlamaCpp

# Path to the locally converted GGML weights
model_path = '/data/opt/llama2_model/llama-2-7b-bin/ggml-model-f16.bin'
llm = LlamaCpp(model_path=model_path, verbose=False)

# stream() yields the completion incrementally, token by token
for s in llm.stream("write me a poem!"):
    print(s, end="", flush=True)
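Besides streaming, a LangChain LLM can also be called synchronously, returning the whole completion at once. The sketch below is a minimal illustration, assuming the same model path as above; the `n_ctx`, `temperature`, and `max_tokens` values are example settings, not ones from this article, and the import is guarded so the code degrades gracefully when `langchain`/`llama-cpp-python` are not installed.

```python
from pathlib import Path

# Guarded import: LlamaCpp requires langchain plus llama-cpp-python
try:
    from langchain.llms.llamacpp import LlamaCpp
    HAVE_LLAMACPP = True
except ImportError:
    HAVE_LLAMACPP = False

# Same model path as in the article
model_path = '/data/opt/llama2_model/llama-2-7b-bin/ggml-model-f16.bin'

if HAVE_LLAMACPP and Path(model_path).exists():
    llm = LlamaCpp(
        model_path=model_path,
        n_ctx=2048,       # context window size (example value)
        temperature=0.7,  # sampling temperature (example value)
        max_tokens=256,   # cap on generated tokens (example value)
        verbose=False,
    )
    # invoke() blocks and returns the full completion as one string,
    # in contrast to the token-by-token stream() shown above
    answer = llm.invoke("write me a poem!")
    print(answer)
```

Streaming is preferable for interactive chat UIs, while the blocking `invoke()` call is simpler when the full text is needed before any post-processing.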