To run the Ollama server bound to the loopback address `127.0.0.1`, you need to have Ollama installed and configured, and a model available locally. Note that `ollama serve` itself takes no `--host` flag; the bind address is controlled by the `OLLAMA_HOST` environment variable, and its default is already the loopback address `127.0.0.1:11434`. The detailed steps and explanations follow:
### Install Ollama
1. **Install Python** (optional; only needed if you also want the Ollama Python client):
Make sure Python 3 is installed. You can check with:
```sh
python3 --version
```
If it is not installed, download and install it from the [official Python website](https://www.python.org/).
2. **Install Ollama**:
Note that `pip install ollama` installs only the Python client library, not the Ollama server itself. On Linux, install the server with the official script (macOS and Windows installers are available from [ollama.com](https://ollama.com/)):
```sh
curl -fsSL https://ollama.com/install.sh | sh
```
### Prepare a Model
Ollama manages models in its own local store (under `~/.ollama/models` by default) rather than loading a standalone file such as `model.pt`. Before starting the server, download a model by name with `ollama pull`.
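The step above reduces to a single pull; a minimal sketch (`llama3` is an example model name — any name from the Ollama library works):

```sh
# download a model into Ollama's local store (llama3 is an example name)
ollama pull llama3
# confirm it is available locally
ollama list
```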
### Run the Command
1. **Choose the bind address**:
You do not need to `cd` into a model directory first; `ollama serve` reads models from its own store regardless of the working directory. The listen address is set through the `OLLAMA_HOST` environment variable (there is no `--host` flag), and its default is already the loopback address `127.0.0.1:11434`.
2. **Start the Ollama server**:
```sh
OLLAMA_HOST=127.0.0.1 ollama serve
```
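If you prefer to set the address once for the whole shell session, export it first; a small sketch (`127.0.0.1:11434` is Ollama's documented default):

```sh
# export once; every subsequent ollama command in this shell will use it
export OLLAMA_HOST=127.0.0.1:11434
echo "Ollama will listen on ${OLLAMA_HOST}"
```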
### Example
Assuming you have already pulled a model, starting the server bound to the loopback address looks like this:
```sh
# bind explicitly to the loopback address (this is also the default)
OLLAMA_HOST=127.0.0.1 ollama serve
```
### Verify the Service Is Running
In another terminal window you can use `curl` or any other HTTP client to check that the service is up. For example:
```sh
# the root endpoint replies with "Ollama is running"
curl http://127.0.0.1:11434/
# request a completion (requires a pulled model; llama3 is an example name)
curl http://127.0.0.1:11434/api/generate -d '{"model":"llama3","prompt":"Hello, Ollama!","stream":false}'
```
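The request body for `/api/generate` is plain JSON; when the prompt comes from a variable, building the body with `printf` avoids shell-quoting mistakes for simple strings. A sketch (the model and prompt are example values):

```sh
MODEL=llama3
PROMPT="Hello, Ollama!"
# build the JSON request body; %s substitutes the variables (no escaping of embedded quotes)
BODY=$(printf '{"model":"%s","prompt":"%s","stream":false}' "$MODEL" "$PROMPT")
echo "$BODY"
```

Once the server is running, the body can be sent with `echo "$BODY" | curl http://127.0.0.1:11434/api/generate -d @-`.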
### Notes
- **Port configuration**: By default the Ollama server listens on port `11434`, not `8080`. To change the port, include it in `OLLAMA_HOST` (there is no `--port` flag):
```sh
OLLAMA_HOST=127.0.0.1:8081 ollama serve
```
- **Model storage path**: `ollama serve` has no `--model-path` flag and does not load standalone files such as `model.pt`; it only serves models from its own store. To keep that store in a non-default directory, set the `OLLAMA_MODELS` environment variable:
```sh
OLLAMA_MODELS=/path/to/models OLLAMA_HOST=127.0.0.1 ollama serve
```
With the steps above you should be able to run the Ollama server on the loopback address `127.0.0.1` and confirm that it is working. If anything goes wrong, check the log output that `ollama serve` prints to the terminal for more information.