Local LLaMA Models in SmartAdmin

Overview

  • Handler: llm-handler.js
  • Integration: Uses node-llama-cpp for local model execution
  • Model Format: GGUF format models (.gguf files)
  • Configuration: Configurable via settings-handler.js
  • Advantages:
    • Complete privacy and offline operation
    • No API costs
    • Customizable model selection
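
The overview above can be sketched in code. The following is a minimal, hypothetical example of loading a local GGUF model with node-llama-cpp's v3 API (`getLlama`, `LlamaChatSession`); the model path and function name are illustrative assumptions, not SmartAdmin's actual `llm-handler.js` implementation.

```javascript
// Hypothetical sketch: running a prompt against a local GGUF model
// with node-llama-cpp. Paths and names are placeholders.
import path from "node:path";
import { getLlama, LlamaChatSession } from "node-llama-cpp";

async function askLocalModel(prompt) {
  const llama = await getLlama(); // initialize the llama.cpp bindings
  const model = await llama.loadModel({
    // Hypothetical model file; point this at any downloaded .gguf model
    modelPath: path.join("models", "example-model.Q4_K_M.gguf"),
  });
  const context = await model.createContext();
  const session = new LlamaChatSession({
    contextSequence: context.getSequence(),
  });
  // Inference runs entirely on the local machine -- no API calls, no cost
  return session.prompt(prompt);
}
```

Because everything runs in-process, the prompt and the model's output never leave the machine, which is what makes the privacy and offline-operation advantages above possible.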

Downloading GGUF Models

You can download GGUF-format models from Hugging Face or from other model repositories.
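
As one possible workflow, GGUF files hosted on Hugging Face can be fetched with the `huggingface-cli` tool (installed via `pip install huggingface_hub`). The repository ID and file name below are placeholders; substitute the model you chose.

```shell
# Download a single GGUF file into the local ./models directory.
# <repo-id> and <filename> are placeholders for the model you selected.
huggingface-cli download <repo-id> <filename>.gguf --local-dir ./models
```

After the download finishes, point SmartAdmin's model-path setting at the resulting `.gguf` file.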

Ensure you have the necessary rights to use the models you download.

You may also create or train your own GGUF models and use them with SmartAdmin, though this is a highly advanced workflow.