# fuckmit

Read this in other languages: 简体中文

A command-line tool written in Rust that automatically analyzes code changes and generates standardized Git commit messages by integrating with AI providers such as OpenAI, Azure OpenAI, DeepSeek, and Qwen.
## Installation

For Windows, macOS (10.12+), or Linux, you can download binary releases here.

### Homebrew

```bash
brew tap mingeme/tap
brew install fuckmit
```

### Cargo

```bash
cargo install fuckmit
```

If you have the Rust toolchain installed (including cargo), you can install directly from the GitHub repository:

```bash
cargo install --locked --git https://github.com/mingeme/fuckmit
```

Or clone and build manually:

```bash
# Clone repository
git clone https://github.com/mingeme/fuckmit.git
cd fuckmit

# Build project
cargo build --release

# Install binary
cargo install --path .
```

## Supported Providers

| Provider | Status | Supported Models |
|---|---|---|
| OpenAI | ✅ | GPT-3.5, GPT-4, etc |
| Azure OpenAI | ✅ | GPT-3.5, GPT-4, etc |
| DeepSeek | ✅ | DeepSeek Chat |
| Qwen | ✅ | Qwen Turbo, etc |
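Since each provider reads its own `*_API_KEY` environment variable (listed in the Configuration section below), a small pre-flight check in your shell can catch a missing key before a run fails. This is a hypothetical helper for illustration, not part of fuckmit itself:

```shell
#!/bin/sh
# Hypothetical pre-flight check: map a provider name to its API key variable
# and report whether the key is set. Not part of fuckmit itself.
check_key() {
  provider="$1"
  case "$provider" in
    openai)   key="${OPENAI_API_KEY:-}" ;;
    azure)    key="${AZURE_OPENAI_API_KEY:-}" ;;
    deepseek) key="${DEEPSEEK_API_KEY:-}" ;;
    qwen)     key="${QWEN_API_KEY:-}" ;;
    *)        echo "unknown provider: $provider"; return 1 ;;
  esac
  if [ -z "$key" ]; then
    echo "missing API key for $provider"
    return 1
  fi
  echo "API key for $provider is set"
}

export DEEPSEEK_API_KEY="sk-example"
check_key deepseek   # prints: API key for deepseek is set
```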
## Configuration

### OpenAI

```bash
export OPENAI_API_KEY="your-openai-api-key"
export OPENAI_MODEL="gpt-4"                        # Optional, defaults to gpt-3.5-turbo
export OPENAI_BASE_URL="https://api.openai.com/v1" # Optional
```

### Azure OpenAI

```bash
export AZURE_OPENAI_API_KEY="your-azure-api-key"
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
export AZURE_OPENAI_DEPLOYMENT="your-deployment-name"
export AZURE_OPENAI_API_VERSION="2024-02-15-preview" # Optional
```

### DeepSeek

```bash
export DEEPSEEK_API_KEY="your-deepseek-api-key"
export DEEPSEEK_MODEL="deepseek-chat"                  # Optional
export DEEPSEEK_BASE_URL="https://api.deepseek.com/v1" # Optional
```

### Qwen

```bash
export QWEN_API_KEY="your-qwen-api-key"
export QWEN_MODEL="qwen-turbo"                                           # Optional
export QWEN_BASE_URL="https://dashscope.aliyuncs.com/compatible-mode/v1" # Optional
```

### General Settings

```bash
export LLM_MODEL="deepseek/deepseek-chat" # Required
export LLM_TIMEOUT_SECONDS="30"           # Optional, request timeout in seconds
export LLM_MAX_RETRIES="3"                # Optional, retry count
export FUCKMIT_RULES="Use English commit messages" # Optional, custom rules for commit message generation
export FUCKMIT_INTERACTIVE="true"         # Optional, enable interactive mode by default
export FUCKMIT_CANDIDATES="3"             # Optional, number of candidates to generate (1-5, default: 3)
```

Note: The `--rules` CLI flag takes precedence over the `FUCKMIT_RULES` environment variable.

## Usage

```bash
# Generate and commit a Git commit message
fuckmit

# Only display the generated commit message, don't actually commit
fuckmit --dry-run

# Use provider/model format
fuckmit --model openai/gpt-4

# Add custom rules via CLI flag
fuckmit --rules "Use English commit messages"

# Or set rules via environment variable (applies to all invocations)
export FUCKMIT_RULES="Use English commit messages"
fuckmit

# Add change context
fuckmit --context "Fixed user login bug"

# Use rules and context together
fuckmit --rules "Use concise descriptions" --context "Refactored database connection logic"

# Custom AI parameters
fuckmit --max-tokens 1000 --temperature 0.5
```

### Interactive Mode

Interactive mode lets you select from multiple AI-generated commit message candidates and edit them before committing.
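The stated precedence, where the `--rules` flag overrides `FUCKMIT_RULES`, follows the usual CLI convention that an explicit flag beats the environment. That resolution order can be sketched with plain shell parameter expansion (illustrative only, not fuckmit's actual Rust code):

```shell
#!/bin/sh
# Illustrative only: resolve the effective rules the way the docs describe,
# preferring the --rules flag value over the FUCKMIT_RULES environment variable.
FUCKMIT_RULES="Use English commit messages"   # environment default
cli_rules="Use concise descriptions"          # value that would come from --rules

# ${var:-fallback} keeps the flag value when non-empty, else the env var
effective_rules="${cli_rules:-$FUCKMIT_RULES}"
echo "$effective_rules"   # prints: Use concise descriptions
```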
```bash
# Enable interactive mode with the default 3 candidates
fuckmit --interactive

# Generate 5 candidates to choose from
fuckmit --interactive --candidates 5

# Use an environment variable to enable it by default
export FUCKMIT_INTERACTIVE=true
fuckmit
```

Interactive mode features:

- Generates multiple commit message candidates in parallel
- Select from a numbered list of options
- Edit the selected message in your default editor (`$EDITOR` or `$VISUAL`)
- Quick edit mode for type/scope/description
- Automatic deduplication of identical messages
- Graceful fallback to non-interactive mode in non-TTY environments
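The deduplication step above can be pictured as an order-preserving filter over the generated candidates. Here is a generic shell sketch of that idea (not the tool's actual Rust implementation):

```shell
#!/bin/sh
# Generic sketch of order-preserving deduplication over candidate messages:
# awk keeps the first occurrence of each line and drops later repeats.
printf '%s\n' \
  'fix: handle login timeout' \
  'feat: add retry logic' \
  'fix: handle login timeout' \
  | awk '!seen[$0]++'
# prints:
# fix: handle login timeout
# feat: add retry logic
```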
## Options

- `-d, --dry-run`: Only display the generated commit message, don't execute the commit
- `-i, --interactive`: Enable interactive mode to select from multiple candidates
- `--candidates <NUM>`: Number of candidates to generate in interactive mode (1-5, default: 3)
- `-m, --model <MODEL>`: Specify the AI model, or use "provider/model" format
- `-r, --rules <RULES>`: Custom commit message generation rules
- `-c, --context <CONTEXT>`: Provide additional context for the changes
- `--max-tokens <NUM>`: Maximum tokens for the generated message (default: 8192)
- `--temperature <NUM>`: AI generation temperature, range 0.0-2.0 (default: 0.7)
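A `provider/model` value such as `openai/gpt-4` (accepted by `--model` and `LLM_MODEL`) splits at the first slash into a provider and a model name. In shell terms the split works like this (a sketch of the format, not the tool's parser):

```shell
#!/bin/sh
# Sketch of splitting a "provider/model" spec at the first slash.
spec="openai/gpt-4"
provider="${spec%%/*}"   # everything before the first "/": openai
model="${spec#*/}"       # everything after the first "/": gpt-4
echo "$provider $model"  # prints: openai gpt-4
```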
## License

This project is open source under the MIT License - see the LICENSE file for details.