# vllm

**Repository Path**: vinojs/vllm

## Basic Information

- **Project Name**: vllm
- **Description**: A VLLM large language model framework built with Vino
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2025-08-05
- **Last Updated**: 2025-11-21

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# VLLM

A large language model client for `vino`, supporting both `openai` and `ollama` backends.

## Installation

```bash
vino install gitee.com/vinojs/vllm
```

## Usage

### Example 1

```js
import VLLM from "gitee.com/vinojs/vllm"

/**
 * Basic chat (OpenAI-compatible endpoint)
 */
function openai() {
    let config1 = {
        apiKey: "sk-xxxxxxx",
        baseUrl: "https://dashscope.aliyuncs.com/compatible-mode/v1",
        type: "openai"
    }
    let v1 = new VLLM(config1)
    let res1 = v1.chat({
        model: "qwen-turbo",
        messages: [{role: "user", content: "Hello"}],
    })
    console.log(res1);
}

/**
 * Basic chat (local Ollama)
 */
function ollama() {
    let config2 = {
        baseUrl: "http://127.0.0.1:11434",
        type: "ollama"
    }
    let v2 = new VLLM(config2)
    let res2 = v2.chat({
        model: "qwen2.5:7b",
        messages: [{role: "user", content: "Hello"}],
    })
    console.log(res2);
}
```

### Example 2

```js
import VLLM from "gitee.com/vinojs/vllm"

/**
 * Streaming output (OpenAI-compatible endpoint)
 */
function openai() {
    let config1 = {
        apiKey: "sk-xxxxxxx",
        baseUrl: "https://dashscope.aliyuncs.com/compatible-mode/v1",
        type: "openai"
    }
    let v1 = new VLLM(config1)
    let options = {
        model: "qwen-turbo",
        stream: true,
        messages: [
            {role: "system", content: "You are an AI assistant"},
            {role: "user", content: "Who are you?"}
        ]
    }
    v1.chat(options, (mes) => {
        // streaming callback
        console.log(mes);
    })
}

/**
 * Streaming output (local Ollama)
 */
function ollama() {
    let config2 = {
        baseUrl: "http://127.0.0.1:11434",
        type: "ollama"
    }
    let v2 = new VLLM(config2)
    let options = {
        model: "qwen2.5:7b",
        stream: true,
        messages: [
            {role: "system", content: "You are an AI assistant"},
            {role: "user", content: "Who are you?"}
        ]
    }
    v2.chat(options, (mes) => {
        // streaming callback
        console.log(mes);
    })
}
```

### Example 3

```js
import VLLM from "gitee.com/vinojs/vllm"

/**
 * Tool calling
 */
let tools = [{
    type: "function",
    function: {
        name: "getUser",
        description: "Look up a user's basic information by name",
        parameters: {
            type: "object",
            properties: {
                name: {
                    type: "string",
                    description: "The user's name"
                }
            },
            required: ["name"]
        }
    }
}, {
    type: "function",
    function: {
        name: "get_current_time",
        description: "Useful when you want to know the current time.",
        parameters: {}
    }
}]

let config1 = {
    apiKey: "sk-xxxxxxx",
    baseUrl: "https://dashscope.aliyuncs.com/compatible-mode/v1",
    type: "openai"
}
let v1 = new VLLM(config1)

let config2 = {
    baseUrl: "http://127.0.0.1:11434",
    type: "ollama"
}
let v2 = new VLLM(config2)

function openai() {
    let res = v1.chat({
        model: "qwen-turbo",
        tools: tools,
        messages: [{role: "user", content: "Who are you?"}]
    })
    console.log(res);

    let res2 = v1.chat({
        model: "qwen-turbo",
        tools: tools,
        messages: [{role: "user", content: "How old is Zhang San?"}]
    })
    console.log(res2);
}

function ollama() {
    let res = v2.chat({
        model: "qwen2.5:7b",
        tools: tools,
        messages: [{role: "user", content: "What time is it now?"}]
    })
    console.log(res);

    let res2 = v2.chat({
        model: "qwen2.5:7b",
        tools: tools,
        messages: [{role: "user", content: "How old is Zhang San?"}]
    })
    console.log(res2);
}
```
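The streaming examples above deliver the reply in incremental chunks through a callback. If you want the full reply as a single string, a small wrapper can collect the chunks into a Promise. This is a sketch, not part of the `vllm` API: it assumes the callback receives plain text fragments and a `null` (or `undefined`) value once the stream ends; `collectStream` and `fakeChat` are names invented here for illustration.

```js
// Wrap a callback-style streaming chat into a Promise that resolves with
// the concatenated text. Assumption: chatFn(options, onChunk) invokes
// onChunk(chunk) per text fragment and onChunk(null) at end of stream.
function collectStream(chatFn, options) {
    return new Promise((resolve) => {
        let parts = []
        chatFn(options, (chunk) => {
            if (chunk == null) {
                resolve(parts.join(""))  // stream finished: join all fragments
            } else {
                parts.push(chunk)        // accumulate one fragment
            }
        })
    })
}

// Demo with a fake chat function standing in for v1.chat:
function fakeChat(options, cb) {
    ["Hel", "lo", "!"].forEach(cb)
    cb(null) // signal end of stream
}
collectStream(fakeChat, {}).then((text) => console.log(text)) // → Hello!
```

With a real client you would call `collectStream((o, cb) => v1.chat(o, cb), options)`, provided the library's stream termination matches the assumption above.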
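The tool-calling example only shows the request side. A typical round continues by running the requested function locally and sending its result back in a follow-up `chat` call. The sketch below shows that dispatch step; it is not `vllm` API: the tool-call shape is assumed to follow the OpenAI response format (`function.name` plus JSON-string `function.arguments`), and `dispatchToolCall`, `toolImpls`, and the stand-in `getUser` data are all hypothetical.

```js
// Hypothetical local implementations for the two tools declared above.
const toolImpls = {
    getUser: ({name}) => ({name, age: 30}),           // stand-in data
    get_current_time: () => new Date().toISOString()
}

// Dispatch one OpenAI-style tool call: look up the implementation by
// function name and invoke it with the parsed JSON arguments.
function dispatchToolCall(toolCall) {
    let impl = toolImpls[toolCall.function.name]
    if (!impl) throw new Error("unknown tool: " + toolCall.function.name)
    let args = toolCall.function.arguments
        ? JSON.parse(toolCall.function.arguments)
        : {}
    return impl(args)
}

// A tool call as it might appear in a chat response:
let call = {
    type: "function",
    function: {name: "getUser", arguments: '{"name": "Zhang San"}'}
}
let result = dispatchToolCall(call)
console.log(result) // → { name: 'Zhang San', age: 30 }

// To continue the conversation, the result would typically go back to the
// model as a "tool" message in a follow-up chat() call (shape assumed):
// messages.push({role: "tool", content: JSON.stringify(result)})
```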