NAME
hf-model-tool — A CLI tool for managing your locally downloaded Huggingface models and datasets
SYNOPSIS
pip install hf-model-tool
DESCRIPTION
A CLI tool for managing your locally downloaded Huggingface models and datasets
README
HF-MODEL-TOOL
A CLI tool for managing your locally downloaded Huggingface models and datasets
Disclaimer: This tool is not affiliated with or endorsed by Hugging Face. It is an independent, community-developed utility.
Screenshots
Welcome Screen
List All Assets
Features
Core Functionality
- Smart Asset Detection: Detect HuggingFace models, datasets, LoRA adapters, fine-tuned models, Ollama models, and custom formats
- Asset Listing: View all your AI assets with size information and metadata
- Duplicate Detection: Find and clean duplicate downloads to save disk space
- Asset Details: View model configurations and dataset documentation with rich formatting
- Directory Management: Add and manage custom directories containing your AI assets
- Manifest System: Customize model names, publishers, and metadata with JSON manifests
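The duplicate-detection idea above can be sketched by fingerprinting each asset directory's contents. This is an illustrative stand-in, not the tool's actual algorithm: the function names and the hashing scheme here are assumptions.

```python
import hashlib
import os
import tempfile
from collections import defaultdict

def dir_fingerprint(path):
    """Hash every file's relative path and contents into one digest."""
    h = hashlib.sha256()
    for root, _, files in os.walk(path):
        for name in sorted(files):
            full = os.path.join(root, name)
            rel = os.path.relpath(full, path)
            h.update(rel.encode())
            with open(full, "rb") as f:
                h.update(f.read())
    return h.hexdigest()

def find_duplicates(asset_dirs):
    """Group asset directories whose contents are byte-identical."""
    groups = defaultdict(list)
    for d in asset_dirs:
        groups[dir_fingerprint(d)].append(d)
    return [dirs for dirs in groups.values() if len(dirs) > 1]
```

Two directories only count as duplicates here if every file matches byte-for-byte at the same relative path, which keeps false positives out at the cost of missing near-duplicates.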
Supported Asset Types
- HuggingFace Models & Datasets: Standard cached downloads from Hugging Face Hub
- LoRA Adapters: Fine-tuned adapters from training frameworks like Unsloth
- Custom Models: Fine-tuned models, merged models, and other custom formats
- Ollama Models: GGUF format models from Ollama (both user and system directories)
Installation
From PyPI (Recommended)
pip install hf-model-tool
From Source
git clone https://github.com/Chen-zexi/hf-model-tool.git
cd hf-model-tool
pip install -e .
Usage
Interactive Mode
hf-model-tool
Launches the interactive CLI with:
- System status showing assets across all configured directories
- Asset management tools for all supported formats
- Easy directory configuration and management
Integration with vLLM-CLI
The tool provides an API specifically designed for vLLM-CLI model discovery and management, and it can also be launched directly from within vLLM-CLI.
Serving Custom Models in vLLM-CLI
For detailed instructions on serving models from custom directories with vLLM-CLI, see:
- Custom Model Serving Guide - Complete guide for vLLM-CLI integration
Python API Usage
from hf_model_tool import get_downloaded_models
from hf_model_tool.api import HFModelAPI

# Quick access to models
models = get_downloaded_models()

# Full API access
api = HFModelAPI()
api.add_directory("/path/to/models", "custom")
assets = api.list_assets()
See API Reference for complete documentation.
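Once you have the asset list, ordinary Python is enough to post-process it. The helper below is a minimal sketch: it assumes each asset is a dict with "name" and "size" (bytes) keys, which is an assumption about the return shape, not documented API.

```python
# Illustrative only: assumes each asset is a dict with "name" and
# "size" (bytes) keys -- the real return shape may differ.
def largest_assets(assets, top_n=3):
    """Return the top_n assets by size, largest first."""
    return sorted(assets, key=lambda a: a["size"], reverse=True)[:top_n]

# Hypothetical sample data standing in for api.list_assets() output
sample = [
    {"name": "llama-3-8b", "size": 16_000_000_000},
    {"name": "tiny-bert", "size": 440_000_000},
    {"name": "mistral-7b", "size": 14_500_000_000},
]
print([a["name"] for a in largest_assets(sample, top_n=2)])
```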
Command Line Usage
The tool provides comprehensive command-line options for direct operations:
Basic Commands
# Launch interactive mode
hf-model-tool

# List all detected assets
hf-model-tool -l
hf-model-tool --list
# Enter asset management mode
hf-model-tool -m
hf-model-tool --manage
# View detailed asset information
hf-model-tool -v
hf-model-tool --view
hf-model-tool --details
# Show version
hf-model-tool --version

# Show help
hf-model-tool -h
hf-model-tool --help
Directory Management
# Add a directory containing LoRA adapters
hf-model-tool -path ~/my-lora-models
hf-model-tool --add-path ~/my-lora-models

# Add a custom model directory
hf-model-tool -path /data/custom-models
# Add current working directory
hf-model-tool -path .

# Add with absolute path
hf-model-tool -path /home/user/ai-projects/models
Sorting Options
# List assets sorted by size (default)
hf-model-tool -l --sort size

# List assets sorted by name
hf-model-tool -l --sort name
# List assets sorted by date
hf-model-tool -l --sort date
Interactive Navigation
- ↑/↓ arrows: Navigate menu options
- Enter: Select current option
- Back: Select to return to previous menu
- Config: Select to access settings and directory management
- Main Menu: Select to return to main menu from anywhere
- Exit: Select for a clean application shutdown
- Ctrl+C: Force exit
Key Workflows
- Directory Setup: Add directories containing your AI assets (HuggingFace cache, LoRA adapters, custom models)
- List Assets: View all detected assets with size information across all directories
- Manage Assets: Delete unwanted files and deduplicate identical assets
- View Details: Inspect model configurations and dataset documentation
- Configuration: Manage directories, change sorting preferences, and access help
Documentation
Quick Links
- Manifest System Guide - Learn how to customize model metadata with JSON manifests
- Custom Directories Guide - Configure and manage custom model directories
- API Reference - Complete API documentation for programmatic usage
Configuration
Directory Management
Add custom directories containing your AI assets:
- HuggingFace Cache: Standard HF cache with models--publisher--name structure
- Custom Directory: LoRA adapters, fine-tuned models, or other custom formats
- Auto-detect: Let the tool automatically determine the directory type
Interactive Configuration
Access via "Config" from any screen:
- Directory Management: Add, remove, and test directories
- Sort Options: Size (default), Date, or Name
- Help System: Navigation and usage guide
Manifest System
Automatic Generation: When you add a custom directory, the tool automatically generates a models_manifest.json file that:
- Becomes the primary source for model information
- Is always read first for classification
- Can be edited to ensure accurate display in vLLM-CLI
Customize model metadata using JSON manifests:
- Define custom names for your models
- Specify publishers and organizations
- Add notes and documentation
- See Manifest System Guide for details
Important: Review and edit auto-generated manifests to ensure model names and publishers are accurate for your use case.
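As a rough picture, a manifest entry pairs a model directory with the metadata you want displayed. The field names below are an assumption based on the capabilities listed above; consult the Manifest System Guide for the real schema.

```json
{
  "models": [
    {
      "name": "my-merged-llama",
      "publisher": "my-org",
      "notes": "Merged from two fine-tunes; used for internal eval."
    }
  ]
}
```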
Project Structure
hf_model_tool/
├── __main__.py # Application entry point with welcome screen
├── cache.py # Multi-directory asset scanning
├── ui.py # Rich terminal interface components
├── utils.py # Asset grouping and duplicate detection
├── navigation.py # Menu navigation
├── config.py # Configuration and directory management
└── asset_detector.py # Asset detection (LoRA, custom models, etc.)
Development
Requirements
- Python ≥ 3.7
- Dependencies: rich, inquirer, html2text
Logging
Application logs are written to ~/.hf-model-tool.log for debugging and monitoring.
Configuration Storage
Settings and directory configurations are stored in ~/.config/hf-model-tool/config.json
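Since the configuration is plain JSON, it is easy to inspect programmatically. The snippet below is illustrative only: the actual schema of config.json is not documented here, so the field names ("directories", "path", "type", "sort") are assumptions, and a sample string stands in for the real file.

```python
import json

# Hypothetical stand-in for ~/.config/hf-model-tool/config.json;
# the field names below are assumptions, not the documented schema.
sample_config = """
{
  "directories": [
    {"path": "/home/user/ai-projects/models", "type": "custom"},
    {"path": "/home/user/.cache/huggingface/hub", "type": "huggingface"}
  ],
  "sort": "size"
}
"""

config = json.loads(sample_config)
for entry in config["directories"]:
    print(f'{entry["type"]:12} {entry["path"]}')
```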
Contributing
We welcome contributions from the community! Please feel free to:
- Open an issue at GitHub Issues
- Submit a pull request with your improvements
- Share feedback about your experience using the tool
License
This project is licensed under the MIT License - see the LICENSE file for details.