Releases: codeMaestro78/MLcli

Release v0.3.1

17 Jan 20:01

Version bump to 0.3.1 with the latest updates.

mlcli-toolkit v0.3.0

14 Dec 13:51

🚀 Release v0.3.0 - New ML Algorithms!

What's New

Gradient Boosting

  • LightGBM: Fast gradient boosting with leaf-wise tree growth, early stopping, and feature importance
  • CatBoost: Gradient boosting with native categorical feature handling
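
CatBoost is not shown in the Quick Start below, but as a sketch it should plug into the same train command; the catboost model name here is an assumption based on the convention used for lightgbm.

# Assumed invocation, mirroring the lightgbm example in Quick Start
mlcli train -d data.csv -m catboost --target label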

Clustering Algorithms

  • K-Means: Partition-based clustering with silhouette metrics and automatic optimal K detection
  • DBSCAN: Density-based clustering with automatic noise detection
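
DBSCAN should be invoked like the K-Means example in the Quick Start; the dbscan model name below is an assumption following that naming convention.

# Assumed invocation, mirroring the kmeans example
mlcli train -d data.csv -m dbscan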

Anomaly Detection

  • Isolation Forest: Tree-based anomaly detection using the isolation principle
  • One-Class SVM: Novelty detection using support vector methods
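
One-Class SVM presumably follows the same pattern as the Isolation Forest command in the Quick Start; the one_class_svm model name is an assumption.

# Assumed invocation, mirroring the isolation_forest example
mlcli train -d data.csv -m one_class_svm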

Installation

pip install mlcli-toolkit

Quick Start

# Train LightGBM
mlcli train -d data.csv -m lightgbm --target label

# Clustering with K-Means
mlcli train -d data.csv -m kmeans

# Anomaly detection
mlcli train -d data.csv -m isolation_forest

mlcli-toolkit v0.2.0

07 Dec 09:18

🚀 Release v0.2.0 - Production Ready!

What's New

  • Documentation: Comprehensive docs/, examples/, and tests/ folders
  • Contributing: CONTRIBUTING.md, CODE_OF_CONDUCT.md, SECURITY.md
  • Testing: pytest framework with trainer and tuner tests
  • Examples: Sample configs for all models and tuning strategies
  • Improved: GridSearchTuner parameter handling
  • Cleaned: Production-ready project structure

Features

  • CLI Training: Train ML models from the command line
  • Multiple Algorithms: Random Forest, XGBoost, SVM, Logistic Regression
  • Deep Learning: TensorFlow DNN, CNN, RNN trainers
  • Experiment Tracking: Track and compare experiment runs
  • Hyperparameter Tuning: Grid, Random, and Bayesian optimization
  • Model Explainability: SHAP and LIME explanations
  • Data Preprocessing: StandardScaler, Normalization, Encoding, Feature Selection
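
As a minimal sketch, the classic estimators listed above are trained with the same train command shown in the v0.3.0 Quick Start; the random_forest and xgboost model names are assumptions based on that naming convention.

# Assumed invocations for the classic models, following the Quick Start pattern
mlcli train -d data.csv -m random_forest --target label
mlcli train -d data.csv -m xgboost --target label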

Installation

pip install mlcli-toolkit