
Ollama is a tool designed for running and customizing large language models in local environments. It provides a simple yet efficient interface for creating, running, and managing these models, along with a rich library of pre-built models that can be easily integrated into various applications.
Download link: https://ollama.com/download

After installation, open a terminal and run ollama to see a description of the available commands. These commands are used to manage local large models, including downloading and running them.

# Command Descriptions
ollama --version # Display the currently installed Ollama version.
ollama serve # Start the Ollama service; it listens at http://localhost:11434 by default.
ollama create <model_name> [-f <modelfile_path>] # Create a model from a Modelfile (see the sketch below).
ollama show <model_name> # Display information about the specified model.
ollama run <model_name> # Run the specified model.
ollama stop <model_name> # Stop a running model.
ollama pull <model_name> # Pull the specified model from the registry.
ollama push <model_name> # Push a local model to the registry.
ollama list # List all downloaded models.
ollama ps # List all running models.
ollama cp <source_model> <destination_model> # Copy a model to a new name.
ollama rm <model_name> # Delete the specified model.
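
The ollama create command above builds a custom model from a Modelfile. Below is a minimal sketch: the model name my-coder, the sampling parameter, and the system prompt are purely illustrative, and any base model that has already been pulled can be used in the FROM line (deepseek-coder, introduced in the next section, is used here as an example).

# Write a minimal Modelfile (contents are illustrative)
cat > Modelfile <<'EOF'
FROM deepseek-coder
PARAMETER temperature 0.2
SYSTEM "You are a concise coding assistant."
EOF
# Build a custom model named my-coder from the Modelfile, then run it
ollama create my-coder -f ./Modelfile
ollama run my-coder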
3. DeepSeek-Coder
DeepSeek-Coder is a code language model developed by DeepSeek, trained on a large-scale dataset of code and natural language. It supports project-level code completion and infilling, and achieves leading performance among open-source code models across a range of programming languages and benchmarks.

# Pull
ollama pull deepseek-coder
# Run
ollama run deepseek-coder
# The API is available at http://localhost:11434 by default
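
With the model running (or the ollama service started), it can also be queried over the local REST API instead of the interactive prompt. A minimal sketch, assuming deepseek-coder has been pulled as above; the prompt text is only an example:

# Request a completion from deepseek-coder via the HTTP API (non-streaming)
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-coder",
  "prompt": "Write a Python function that checks whether a number is prime.",
  "stream": false
}'

The response is a JSON object whose response field contains the generated text.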
