
How do you integrate Ollama into Python, JavaScript, and other programming languages?

February 19, 19:50

Ollama can be easily integrated into a wide range of programming languages and frameworks:

Python integration:

Using the ollama Python library:

```python
import ollama

# Generate text
response = ollama.generate(model='llama3.1', prompt='Hello, how are you?')
print(response['response'])

# Chat conversation
messages = [
    {'role': 'user', 'content': 'Hello!'},
    {'role': 'assistant', 'content': 'Hi there!'},
    {'role': 'user', 'content': 'How are you?'}
]
response = ollama.chat(model='llama3.1', messages=messages)
print(response['message']['content'])

# Streaming response
for chunk in ollama.generate(model='llama3.1', prompt='Tell me a story', stream=True):
    print(chunk['response'], end='', flush=True)
```
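
By default the library talks to a local server at http://localhost:11434. If your server runs elsewhere, the library also exposes a Client class that accepts a host; here is a minimal sketch, where the host URL is a placeholder for your own server address:

```python
from ollama import Client

# Placeholder host; point this at wherever your Ollama server is running
client = Client(host='http://localhost:11434')

response = client.chat(
    model='llama3.1',
    messages=[{'role': 'user', 'content': 'Hello!'}],
)
print(response['message']['content'])
```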

JavaScript/Node.js integration:

Using the ollama-js library (published on npm as the ollama package):

```javascript
import { Ollama } from 'ollama'

const client = new Ollama()

// Generate text
const response = await client.generate({
  model: 'llama3.1',
  prompt: 'Hello, how are you?'
})
console.log(response.response)

// Chat conversation
const chat = await client.chat({
  model: 'llama3.1',
  messages: [
    { role: 'user', content: 'Hello!' },
    { role: 'assistant', content: 'Hi there!' },
    { role: 'user', content: 'How are you?' }
  ]
})
console.log(chat.message.content)
```

Go integration:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

type GenerateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

type GenerateResponse struct {
	Response string `json:"response"`
}

func main() {
	req := GenerateRequest{
		Model:  "llama3.1",
		Prompt: "Hello, how are you?",
		Stream: false, // request one complete response instead of a token stream
	}
	body, _ := json.Marshal(req)

	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewBuffer(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var result GenerateResponse
	json.NewDecoder(resp.Body).Decode(&result)
	fmt.Println(result.Response)
}
```

LangChain integration:

```python
from langchain_community.llms import Ollama

llm = Ollama(model="llama3.1")

# Simple invocation
response = llm.invoke("Tell me a joke")
print(response)

# Chained invocation
from langchain.prompts import ChatPromptTemplate
from langchain.schema import StrOutputParser

prompt = ChatPromptTemplate.from_template("Tell me a {adjective} joke about {topic}")
chain = prompt | llm | StrOutputParser()
print(chain.invoke({"adjective": "funny", "topic": "programming"}))
```
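
Because the Ollama wrapper is a LangChain Runnable, token streaming uses the same stream() interface as any other runnable; a minimal sketch:

```python
from langchain_community.llms import Ollama

llm = Ollama(model="llama3.1")

# Print tokens as they arrive instead of waiting for the full completion
for chunk in llm.stream("Tell me a short joke about programming"):
    print(chunk, end="", flush=True)
```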

REST API integration:

Any language with HTTP support can call the REST API directly:

```bash
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1",
  "prompt": "Hello, how are you?",
  "stream": false
}'
```
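
As an illustration, here is a minimal sketch of the same call from plain Python using the requests package, with no Ollama-specific library involved:

```python
import requests

# POST to the local Ollama server's generate endpoint
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.1",
        "prompt": "Hello, how are you?",
        "stream": False,  # return one complete JSON object instead of a stream
    },
)
resp.raise_for_status()
print(resp.json()["response"])
```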
Tags: Ollama