MNNCLI

Note: This project is under active development and may contain bugs or unfinished features. Use with caution.

MNNCLI is a command-line tool for MNN (Mobile Neural Network) that provides commands for managing, running, serving, benchmarking, and searching LLM models.

Features

  • Model Management: List, download, and delete models
  • Model Serving: Start a web server to serve models via HTTP API
  • Model Execution: Run models with prompts or prompt files
  • Benchmarking: Performance benchmarking for models
  • Model Search: Search for models in the Hugging Face repository

Building

To build MNNCLI, run the following command from the mnncli directory:

sh build.sh

The executable will be located at build_mnncli/mnncli.
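
As a quick sanity check after building, you can invoke one of the commands from the Usage section, for example listing locally available models (this assumes the build produced the binary at the path above):

./build_mnncli/mnncli list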

Usage

List Models

./build_mnncli/mnncli list

Serve Model

./build_mnncli/mnncli serve <model_name>
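
For example, the following sketch serves a model and queries it through the OpenAI-compatible HTTP API. The model name, port, and endpoint path here are assumptions for illustration; use a model you have already downloaded and the address that the server prints on startup:

./build_mnncli/mnncli serve Qwen2.5-1.5B-Instruct-MNN
curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "Qwen2.5-1.5B-Instruct-MNN", "messages": [{"role": "user", "content": "Hello"}]}'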

Run Model

./build_mnncli/mnncli run <model_name> [-c config_path] [-p prompt] [-f prompt_file]
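
For example, with an inline prompt or a prompt file (the model name and file name are placeholders; the model must already be downloaded):

./build_mnncli/mnncli run Qwen2.5-1.5B-Instruct-MNN -p "Explain what MNN is in one sentence."
./build_mnncli/mnncli run Qwen2.5-1.5B-Instruct-MNN -f prompts.txt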

Benchmark Model

./build_mnncli/mnncli benchmark <model_name> [-c config_path]
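
For example, benchmarking a previously downloaded model (the model name is a placeholder):

./build_mnncli/mnncli benchmark Qwen2.5-1.5B-Instruct-MNN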

Download Model

./build_mnncli/mnncli download <model_name> <repo_name>
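
For example, downloading an MNN-converted model from Hugging Face (both names below are illustrative; use a model and repository returned by the search command):

./build_mnncli/mnncli download Qwen2.5-1.5B-Instruct-MNN taobao-mnn/Qwen2.5-1.5B-Instruct-MNN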

Search Models

./build_mnncli/mnncli search <keyword>
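
For example, searching Hugging Face for MNN builds of Qwen models (the keyword is just an example):

./build_mnncli/mnncli search qwen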

Delete Model

./build_mnncli/mnncli delete <model_name>

Dependencies

  • OpenSSL (for HTTPS support)
  • MNN core library
  • LLM engine library
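
On Debian/Ubuntu, a typical way to install the system-level build prerequisites looks like the following; the exact package names are assumptions and may differ on other distributions:

sudo apt-get install build-essential cmake libssl-dev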

Notes

  • The tool requires macOS 13.0+ when building on Apple platforms
  • On Linux, ensure libssl-dev (or equivalent) is installed
  • Models are cached in the user's cache directory
  • The web server provides an OpenAI-compatible API interface