{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "e5c5a211-2ccd-4341-af10-ac546484b91f",
   "metadata": {
    "tags": []
   },
   "source": [
    "## Instructions\n",
    "\n",
    "- To start or restart the Notebook, click \"Restart and Run All Cells\" in the toolbar above. Once `Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)` appears, startup has succeeded and the page can be opened.\n",
    "- To open the page:\n",
    "    - In the console, open \"Custom Service\" and set the custom service port to 8080.\n",
    "- Account and password:<br>\n",
    "    <img src=\"./assets/账号.png\" style=\"width:50%;\" alt=\"account and password\">\n",
    "- Page walkthrough:\n",
    "    ![demo](./assets/示例.jpeg)\n",
    "- Downloading models: open a new Launcher tab, open a terminal, and run one of the commands from the `Download` column below; then refresh the page opened via \"Custom Service\" to use the model.\n",
    "\n",
    "| Model              | Parameters | Size  | Download                        |\n",
    "| ------------------ | ---------- | ----- | ------------------------------- |\n",
    "| Llama 3.1          | 8B         | 4.7GB | `ollama pull llama3.1`          |\n",
    "| Llama 3.1          | 70B        | 40GB  | `ollama pull llama3.1:70b`      |\n",
    "| Llama 3.1          | 405B       | 231GB | `ollama pull llama3.1:405b`     |\n",
    "| Phi 3 Mini         | 3.8B       | 2.3GB | `ollama pull phi3`              |\n",
    "| Phi 3 Medium       | 14B        | 7.9GB | `ollama pull phi3:medium`       |\n",
    "| Gemma 2            | 2B         | 1.6GB | `ollama pull gemma2:2b`         |\n",
    "| Gemma 2            | 9B         | 5.5GB | `ollama pull gemma2`            |\n",
    "| Gemma 2            | 27B        | 16GB  | `ollama pull gemma2:27b`        |\n",
    "| Mistral            | 7B         | 4.1GB | `ollama pull mistral`           |\n",
    "| Moondream 2        | 1.4B       | 829MB | `ollama pull moondream`         |\n",
    "| Neural Chat        | 7B         | 4.1GB | `ollama pull neural-chat`       |\n",
    "| Starling           | 7B         | 4.1GB | `ollama pull starling-lm`       |\n",
    "| Code Llama         | 7B         | 3.8GB | `ollama pull codellama`         |\n",
    "| Llama 2 Uncensored | 7B         | 3.8GB | `ollama pull llama2-uncensored` |\n",
    "| LLaVA              | 7B         | 4.5GB | `ollama pull llava`             |\n",
    "| Solar              | 10.7B      | 6.1GB | `ollama pull solar`             |\n",
    "\n",
    "- For other advanced usage, see:\n",
    "    - https://github.com/ollama/ollama\n",
    "    - https://github.com/open-webui/open-webui\n",
    "\n",
    "## Features\n",
    "\n",
    "- Upstream projects: https://github.com/ollama/ollama and https://github.com/open-webui/open-webui\n",
    "- ollama-webui: a large language model chat tool with a web UI\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "9e81ae9d-3a34-43a0-943a-ff5e9d6ce961",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Start the service (Ollama + Open WebUI); wait for the Uvicorn startup message\n",
    "!sh start.sh"
   ]
  },
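  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3f6d2c10-5b7e-4c2a-9d31-8a0f4e21b6c7",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional sanity check (a sketch, assuming start.sh launches Ollama on its\n",
    "# default API port 11434): list the locally downloaded models. If this command\n",
    "# fails, the Ollama service is not up yet — re-run the start cell above.\n",
    "!ollama list"
   ]
  },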
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "9c881a9d-351b-4d76-b1ef-7b0e2f1910d6",
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.13"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}