Phi-3 model
Demonstration environment
Development board: Raspberry Pi 5
SD (TF) card: 64 GB (at least 16 GB; the larger the capacity, the more models you can try)
Raspberry Pi 5 (8 GB RAM): can run models with up to 8B parameters
Raspberry Pi 5 (4 GB RAM): can run models with up to 3B parameters; it cannot run the Phi-3 model
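Before pulling a model, it can help to confirm how much memory and SD-card space is actually available. A minimal sketch using the standard Linux tools free and df (these checks are not part of the original steps; the path is just an example):

```
# Show total and available memory (the 8 GB board is needed for Phi-3 Mini)
free -h

# Show free space on the SD card's root filesystem
df -h /
```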
Phi-3 is a powerful and cost-effective Small Language Model (SLM) released by Microsoft. It outperforms models of the same size and larger across a wide range of language, reasoning, coding, and math benchmarks.
Model | Parameters
---|---
Phi-3 (Mini) | 3.8B
Phi-3 (Medium) | 14B
Raspberry Pi 5 (8 GB RAM): test the Phi-3 model with 3.8B parameters!
The pull command downloads the model from the Ollama model library.
```
ollama pull phi3:3.8b
```
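Once the download finishes, you can verify that the model is available locally with the standard Ollama list command (not shown in the original steps):

```
# List locally downloaded models; phi3:3.8b should appear with its size
ollama list
```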
If the model has not been downloaded yet, the run command will automatically pull the Phi-3 3.8B model first and then start it.
```
ollama run phi3:3.8b
```
```
Write a joke about dogs
```

```
Tell me another one
```

```
Tell me a bedtime story
```
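Instead of starting an interactive session, ollama run also accepts a prompt directly on the command line and prints a single answer; the prompt below is only an example:

```
# Ask a one-shot question without entering the interactive dialogue
ollama run phi3:3.8b "Write a joke about dogs"
```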
The response time depends on the hardware configuration, so please be patient.
You can end the conversation with the shortcut Ctrl+d or by typing /bye.
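While the Ollama service is running, it also exposes a local REST API on port 11434, so other programs on the Raspberry Pi can query Phi-3. A minimal sketch using curl and the documented /api/generate endpoint (the prompt text is just an example):

```
# Send a prompt to the local Ollama server and print a single JSON response
curl http://localhost:11434/api/generate -d '{
  "model": "phi3:3.8b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```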
Ollama
Website: https://ollama.com/
GitHub: https://github.com/ollama/ollama
Phi-3
Ollama model page: https://ollama.com/library/phi3