Phi-3
Demo Environment
Development board: Jetson Orin series motherboard
SSD: 128 GB
Tutorial application scope: whether a board can run the model depends on the system's available memory. The user's own environment and programs running in the background may cause the model to fail to run. You can check free memory as shown after the table below.
| Motherboard model | Ollama | Open WebUI |
|---|---|---|
| Jetson Orin NX 16GB | √ | √ |
| Jetson Orin NX 8GB | √ | √ |
| Jetson Orin Nano 8GB | √ | √ |
| Jetson Orin Nano 4GB | √ | √ |
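Because whether the model runs depends on available memory, it is worth checking the memory headroom before pulling anything. The commands below are standard Linux/Jetson tools rather than part of this tutorial's required steps; `jtop` assumes the optional jetson-stats package is installed.

```
# Show free system memory; the model weights must fit in the available RAM
free -h

# Optional: live CPU/GPU/memory monitor for Jetson boards
# (requires the jetson-stats package: sudo pip3 install jetson-stats)
jtop
```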
Phi-3 is a powerful, cost-effective Small Language Model (SLM) from Microsoft that outperforms models of the same size and the next size up across a variety of language, reasoning, coding, and math benchmarks.
Model size
| Model | Parameters |
|---|---|
| Phi-3 (Mini) | 3.8B |
| Phi-3 (Medium) | 14B |
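For reference, each size above corresponds to a tag in the Ollama model library (the pull command itself is introduced in the next section). The tag names below are assumed from the phi3 page on ollama.com; check https://ollama.com/library/phi3 for the currently published tags, and note that the 14B Medium model needs considerably more memory than the Mini.

```
# phi3:3.8b -> Phi-3 Mini   (3.8B parameters)
# phi3:14b  -> Phi-3 Medium (14B parameters)
# Pull the Medium variant only if the board has enough free memory
ollama pull phi3:14b
```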
Pull Phi-3
The pull command downloads the model from the Ollama model library:
ollama pull phi3:3.8b
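Once the pull finishes, you can confirm the model is available locally with the standard `ollama list` command:

```
# List the models that have been downloaded locally
ollama list
```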
Run Phi-3
If the model has not been pulled yet, the run command will download the Phi-3 3.8B model automatically and then start it:
ollama run phi3:3.8b
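As an optional alternative to the interactive session, the Ollama CLI also accepts a prompt directly on the command line, for example:

```
# Run a single prompt without entering the interactive session
ollama run phi3:3.8b "Tell me a bedtime story"
```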
Have a conversation
At the interactive prompt, enter a question, for example:
Tell me a bedtime story
How long the reply takes depends on the hardware configuration, so be patient!
End the conversation
Use the Ctrl+d shortcut or type /bye to end the conversation!
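After the conversation ends, the model may stay loaded in memory for a few minutes. The commands below are optional housekeeping using the standard Ollama CLI; `ollama ps` is only available on recent Ollama versions.

```
# Show which models are currently loaded in memory (recent Ollama versions)
ollama ps

# Delete the downloaded model from the SSD if it is no longer needed
ollama rm phi3:3.8b
```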
References
Ollama
Official website: https://ollama.com/
GitHub: https://github.com/ollama/ollama
Phi-3
Ollama corresponding model: https://ollama.com/library/phi3