DeepSeek Coder
Demo Environment
Development board: Jetson Orin series motherboard
SSD: 128GB
Tutorial scope: Whether a board can run a given model depends on the memory available to the system; the user's own environment and programs running in the background may leave too little memory and cause the model to fail to load. You can check the memory currently available with the command shown after the table below.
| Motherboard model | Ollama | Open WebUI |
| --- | --- | --- |
| Jetson Orin NX 16GB | √ | √ |
| Jetson Orin NX 8GB | √ | √ |
| Jetson Orin Nano 8GB | √ | √ |
| Jetson Orin Nano 4GB | √ (needs the small-parameter version) | √ (needs the small-parameter version) |
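To see how much memory is actually free before launching a model, you can use the standard Linux free command (this check is a suggestion of this guide, not an Ollama requirement):

```bash
# Show total, used, and available system memory in human-readable units
free -h
```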
DeepSeek Coder is an open-source large language model (LLM) developed by DeepSeek to understand and generate code.
Model size

| Model | Parameters |
| --- | --- |
| DeepSeek Coder | 1.3B |
| DeepSeek Coder | 6.7B |
| DeepSeek Coder | 33B |
Pull DeepSeek Coder

The pull command downloads the model from the Ollama model library:
```bash
ollama pull deepseek-coder:6.7b
```
Small-parameter model: boards with 8GB of memory or less should pull this version instead:
```bash
ollama pull deepseek-coder:1.3b
```
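To confirm that the download completed, you can list the models stored locally; ollama list is a standard Ollama subcommand, shown here as an optional check:

```bash
# List locally downloaded models with their tags and sizes
ollama list
```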
Run DeepSeek Coder

The run command starts the model; if the model has not been pulled yet, Ollama will automatically download DeepSeek Coder 6.7B before running it:
```bash
ollama run deepseek-coder:6.7b
```
Small-parameter model: boards with 8GB of memory or less should run this version instead:
```bash
ollama run deepseek-coder:1.3b
```
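If you would rather query the model from a script than from the interactive prompt, Ollama also serves a local HTTP API (port 11434 by default). A minimal sketch, assuming the 6.7b model has already been pulled:

```bash
# Send a single non-streaming generation request to the local Ollama server
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-coder:6.7b",
  "prompt": "print HelloWorld in python",
  "stream": false
}'
```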
Start a conversation

Once the model is running, type a question at the prompt, for example:

```
print HelloWorld in python
```
The time needed to answer depends on the hardware configuration, so please be patient!
End the conversation

Use the Ctrl+d shortcut or type /bye to end the conversation.
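Exiting the prompt does not necessarily free the model from memory right away. On recent Ollama releases you can check what is still loaded and unload it explicitly; ollama ps and ollama stop are assumed to be available in your installed version:

```bash
# Show models currently loaded in memory
ollama ps
# Unload the model to release memory
ollama stop deepseek-coder:6.7b
```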
References

Ollama
- Official website: https://ollama.com/
- GitHub: https://github.com/ollama/ollama

DeepSeek Coder
- Model on Ollama: https://ollama.com/library/deepseek-coder