Hi everyone, I'm Chao Ge. In this post I'll show you how to deploy the open-source Qwen model Qwen3.6-35B-A3B locally; this model is particularly strong at coding. We'll pair it with OpenClaw + Ollama and walk through the deployment step by step. Let's get started.

I. Install Ollama
Ollama download link: https://ollama.com/download/windows
II. Download the AI model
Official Qwen3.6-35B-A3B introduction: https://qwen.ai/blog?id=qwen3.6-35b-a3b
Model listing: https://ollama.com/fredrezones55/Qwen3.6-35B-A3B-Uncensored-HauhauCS-Aggressive
| Model | Quantization | File size | Inference VRAM | Suggested GPUs |
|---|---|---|---|---|
| Qwen3.6-35B-A3B | IQ2_M | 13GB | 12~16GB | 3060 / 4060 / 3080 |
| Qwen3.6-35B-A3B | Q2_K_P | 16GB | 14~18GB | 3080 / 3090 / 4060Ti |
| Qwen3.6-35B-A3B | IQ4_XS | 20GB | 16~20GB | 3080 / 3090 / 4070 |
| Qwen3.6-35B-A3B | Q4 | 22GB | 16~20GB (workable) | 3090 / 4090 |
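As a rough aid for choosing a tier, the table above can be expressed as a small lookup. This is a sketch only: it treats the upper end of each VRAM range as the requirement, which is my assumption, not an official sizing rule.

```python
# Quant tiers from the table above: (tag, file size in GB, upper VRAM bound in GB).
# Treating the upper bound of each range as the requirement is an assumption.
QUANTS = [
    ("IQ2_M", 13, 16),
    ("Q2_K_P", 16, 18),
    ("IQ4_XS", 20, 20),
    ("Q4", 22, 20),
]

def pick_quant(vram_gb: int) -> str:
    """Return the largest (by file size) quant whose VRAM bound fits, or 'none'."""
    fits = [q for q in QUANTS if q[2] <= vram_gb]
    return max(fits, key=lambda q: q[1])[0] if fits else "none"

if __name__ == "__main__":
    for vram in (12, 16, 24):
        print(f"{vram}GB VRAM -> {pick_quant(vram)}")
```

With 16GB of VRAM this picks IQ2_M; with 24GB it picks the Q4 build.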
Open PowerShell and run one of the download commands below (pick the quantization that fits your GPU).
IQ2_M download command:
ollama pull fredrezones55/Qwen3.6-35B-A3B-Uncensored-HauhauCS-Aggressive:IQ2_M
Q2_K_P download command:
ollama pull fredrezones55/Qwen3.6-35B-A3B-Uncensored-HauhauCS-Aggressive:Q2_K_P
IQ4_XS download command:
ollama pull fredrezones55/Qwen3.6-35B-A3B-Uncensored-HauhauCS-Aggressive:IQ4_XS
Q4 download command:
ollama pull fredrezones55/Qwen3.6-35B-A3B-Uncensored-HauhauCS-Aggressive:Q4
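Once a model is pulled, you can also talk to it over Ollama's local HTTP API (default port 11434) instead of the CLI. Below is a minimal sketch using only the Python standard library; the model tag assumes you pulled the IQ2_M build, so swap it for whichever quant you downloaded.

```python
import json
import urllib.request

# Default local Ollama endpoint; adjust the port if you changed it.
OLLAMA_URL = "http://localhost:11434/api/generate"
# Assumes the IQ2_M build was pulled; use your own tag otherwise.
MODEL = "fredrezones55/Qwen3.6-35B-A3B-Uncensored-HauhauCS-Aggressive:IQ2_M"

def build_request(prompt: str) -> dict:
    """Build a non-streaming generate request for the Ollama HTTP API."""
    return {"model": MODEL, "prompt": prompt, "stream": False}

def generate(prompt: str) -> str:
    """Send the prompt to the local Ollama server and return the completion."""
    data = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires Ollama running locally):
#   print(generate("Write a Python hello world."))
```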
III. Install the base environment
1. Install Python
https://www.python.org/downloads/
https://www.python.org/ftp/python/pymanager/python-manager-26.1.msix
2. Install Node.js
https://nodejs.org/en/download
3. Install Git
https://git-scm.com/install/windows
https://github.com/git-for-windows/git/releases/download/v2.53.0.windows.3/Git-2.53.0.3-64-bit.exe
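After installing the three tools above, a quick way to confirm they are all reachable from a fresh shell is to check the PATH (names as typically installed on Windows; if `python` resolves differently on your machine, adjust the list):

```python
import shutil

def check_tools(tools=("python", "node", "git")) -> dict:
    """Map each tool name to whether it is found on PATH."""
    return {t: shutil.which(t) is not None for t in tools}

if __name__ == "__main__":
    for tool, ok in check_tools().items():
        print(f"{tool}: {'OK' if ok else 'missing (reopen PowerShell after installing)'}")
```

If a tool shows as missing right after installation, open a new PowerShell window so the updated PATH is picked up.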
IV. Install and configure OpenClaw
OpenClaw install command:
powershell -c "irm https://openclaw.ai/install.ps1 | iex"
Open a new PowerShell window and run the gateway start command:
openclaw gateway
V. Connect Telegram
Run the OpenClaw onboarding command:
openclaw onboard
VI. Common OpenClaw commands
Start a terminal chat session:
openclaw tui
Get the dashboard URL with an access token:
openclaw dashboard
Re-run configuration:
openclaw onboard
