Ollama Custom Model Tutorial: Llama 3.2


Import the ollama library.

import ollama
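
The llama3.2 base model must already be available locally before a custom model can be built on top of it. If it is not, it can be pulled from Python as well; a minimal sketch, assuming a local Ollama server is already running:

import ollama

# Download the base model if it is not present yet (assumes a local Ollama server is running)
ollama.pull('llama3.2')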

Create a class to configure the custom model.

Methods:

  • __init__: Initializes the configuration with the base model, the custom name, the system prompt, and the temperature.

  • name_custom: A read-only property that returns the custom name.

  • get_description: Builds the Modelfile text (the FROM, SYSTEM, and PARAMETER lines).

class ModelFile:
    def __init__(self, model: str, name_custom: str, system: str, temp: float = 0.1) -> None:
        self.__model = model
        self.__name_custom = name_custom
        self.__system = system
        self.__temp = temp

    @property
    def name_custom(self):
        return self.__name_custom

    def get_description(self):
        return (
            f"FROM {self.__model}\n"
            f"SYSTEM {self.__system}\n"
            f"PARAMETER temperature {self.__temp}\n"
        )
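
As a quick sanity check (with illustrative values that are not part of the tutorial), get_description() renders a standard Modelfile:

cfg = ModelFile(model='llama3.2', name_custom='demo_assistant', system='You are concise.')
print(cfg.get_description())
# FROM llama3.2
# SYSTEM You are concise.
# PARAMETER temperature 0.1
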
Create a function to list all available models. It returns the list of models registered in Ollama.

def ollama_list() -> list:
    response_ollama = ollama.list()
    return response_ollama['models']
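
For example, the returned entries can be inspected by name (the 'name' key is the same one used by check_custom_model further below); the printed value is illustrative:

for model in ollama_list():
    print(model['name'])  # e.g. 'llama3.2:latest'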

Create a function to build a custom model based on the passed configuration.

def ollama_build(custom_config: ModelFile) -> None:
    ollama.create(
        model=custom_config.name_custom,
        modelfile=custom_config.get_description()
    )
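
The same Modelfile text can also be written to disk and used with the Ollama CLI (ollama create <name> -f Modelfile). A small helper for that, hypothetical and not part of the original tutorial:

from pathlib import Path

def save_modelfile(custom_config: ModelFile, path: str = 'Modelfile') -> None:
    # Persist the generated Modelfile so it can also be used outside Python,
    # e.g. `ollama create xeroxvaldo_sharopildo -f Modelfile`
    Path(path).write_text(custom_config.get_description())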

Create a function to check if the custom model exists.

def check_custom_model(name_model) -> None:
    models = ollama_list()
    models_names = [model['name'] for model in models]
    if f'{name_model}:latest' in models_names:
        print('Exists')
    else:
        raise Exception('Model does not exist')
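
If raising an exception is too strict for a given workflow, a boolean variant (not part of the original code) is easy to sketch:

def custom_model_exists(name_model: str) -> bool:
    # Same membership test as check_custom_model, but returns a flag instead of raising
    models_names = [model['name'] for model in ollama_list()]
    return f'{name_model}:latest' in models_names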

Create a function to generate a response based on the provided model and prompt.

def ollama_generate(name_model, prompt) -> None:
    response_ollama = ollama.generate(
        model=name_model,
        prompt=prompt
    )
    print(response_ollama['response'])
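
For longer answers, the ollama client also supports streaming; a hedged variant of the same call with stream=True:

def ollama_generate_stream(name_model: str, prompt: str) -> None:
    # Print the response incrementally instead of waiting for the full text
    for chunk in ollama.generate(model=name_model, prompt=prompt, stream=True):
        print(chunk['response'], end='', flush=True)
    print()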

Create a function to delete a model by name.

def ollama_delete(name_model) -> None:
    ollama.delete(name_model)

Create a function to sequence the steps of building, validating, and using the model.

def main(custom_config: ModelFile, prompt) -> None:
    ollama_build(custom_config)
    check_custom_model(custom_config.name_custom)
    ollama_generate(custom_config.name_custom, prompt)
    # ollama_delete(custom_config.name_custom)  # uncomment to remove the custom model after use

Set the prompt and configure the ModelFile.

Input:

  • Model: llama3.2

  • Custom Name: xeroxvaldo_sharopildo

  • System: Smart Anime Assistant

Output: Run the main function to create the model, check if it exists, and generate a response to the prompt.

if __name__ == "__main__":
    prompt: str = 'Who is Naruto Uzumaki ?'
    MF: ModelFile = ModelFile(
        model='llama3.2',
        name_custom='xeroxvaldo_sharopildo',
        system='You are a very smart assistant who knows everything about Anime',
    )
    main(MF, prompt)

Output:

Naruto Uzumaki is the main character of the popular Japanese manga and anime series "Naruto" created by Masashi Kishimoto. He is a young ninja from the Hidden Leaf Village who dreams of becoming Hokage, the leader of the village.

Naruto is known for his determination, bravery, and strong sense of justice. He is also famous for his unique ninja style, which involves using the Nine-Tails chakra (a powerful energy he possesses) to enhance his physical abilities.

Throughout the series, Naruto faces numerous challenges and opponents, including other ninjas from different villages, as well as powerful enemies like members of the Akatsuki and the Ten-Tails Jinchuriki. Despite facing many setbacks and failures, Naruto perseveres and becomes stronger after overcoming each challenge.

Naruto's character development is a central theme of the series as he learns valuable lessons about friendship, sacrifice, and the true meaning of being a ninja. His relationships with teammates Sakura Haruno and Sasuke Uchiha are particularly important in shaping his personality and growth.

The "Naruto" series consists of two main arcs: the original "Naruto" arc (2002-2007) and the "Naruto: Shippuden" arc (2007-2014), with the latter being a continuation of the first arc, featuring a more mature and powerful Naruto.
Overall, Naruto Uzumaki is an iconic anime character who has captured the hearts of millions around the world. His inspiring story and memorable personality make him one of the most beloved characters in anime history!
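
Once built, the custom model persists in the local Ollama registry, so it can be queried again without rebuilding; for example (the prompt here is just an illustration):

ollama_generate('xeroxvaldo_sharopildo', 'Recommend three classic shounen anime.')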

Complete script:

import ollama


class ModelFile:
    def __init__(self, model: str, name_custom: str, system: str, temp: float = 0.1) -> None:
        self.__model = model
        self.__name_custom = name_custom
        self.__system = system
        self.__temp = temp

    @property
    def name_custom(self):
        return self.__name_custom

    def get_description(self):
        return (
            f"FROM {self.__model}\n"
            f"SYSTEM {self.__system}\n"
            f"PARAMETER temperature {self.__temp}\n"
        )


def ollama_list() -> list:
    response_ollama = ollama.list()
    return response_ollama['models']

def ollama_build(custom_config: ModelFile) -> None:
    ollama.create(
        model=custom_config.name_custom,
        modelfile=custom_config.get_description()
    )


def check_custom_model(name_model) -> None:
    models = ollama_list()
    models_names = [model['name'] for model in models]
    if f'{name_model}:latest' in models_names:
        print('Exists')
    else:
        raise Exception('Model does not exist')

def ollama_generate(name_model, prompt) -> None:
    response_ollama = ollama.generate(
        model=name_model,
        prompt=prompt
    )
    print(response_ollama['response'])

def ollama_delete(name_model) -> None:
    ollama.delete(name_model)

def main(custom_config: ModelFile, prompt) -> None:
    ollama_build(custom_config)
    check_custom_model(custom_config.name_custom)
    ollama_generate(custom_config.name_custom, prompt)
    # ollama_delete(custom_config.name_custom)  # uncomment to remove the custom model after use

if __name__ == "__main__":
    prompt: str = 'Who is Naruto Uzumaki ?'
    MF: ModelFile = ModelFile(
        model='llama3.2',
        name_custom='xeroxvaldo_sharopildo',
        system='You are a very smart assistant who knows everything about Anime',
    )
    main(MF, prompt)
