33 changes: 28 additions & 5 deletions INSTRUCTIONS.md
@@ -12,6 +12,7 @@ This directory contains the MCP servers and infrastructure for the AssetOpsBench
- [Utilities](#utilities)
- [FMSRAgent](#fmsragent)
- [TSFMAgent](#tsfmagent)
- [VibrationAgent](#vibrationagent)
- [Plan-Execute Runner](#plan-execute-runner)
- [How it works](#how-it-works)
- [CLI](#cli)
@@ -158,6 +159,23 @@ uv run tsfm-mcp-server
| `run_tsad` | `dataset_path`, `tsfm_output_json`, `timestamp_column`, `target_columns`, `task?`, `false_alarm?`, `ad_model_type?`, ... | Conformal anomaly detection on top of a forecasting output JSON; returns CSV with anomaly labels |
| `run_integrated_tsad` | `dataset_path`, `timestamp_column`, `target_columns`, `model_checkpoint?`, `false_alarm?`, `n_calibration?`, ... | End-to-end forecasting + anomaly detection in one call; returns combined CSV |

### VibrationAgent

**Path:** `src/servers/vibration/main.py`
**Requires:** CouchDB (same env vars as IoTAgent: `COUCHDB_URL`, `COUCHDB_DBNAME`, `COUCHDB_USERNAME`, `COUCHDB_PASSWORD`); `numpy`, `scipy`
**DSP core:** `src/servers/vibration/dsp/` — adapted from [vibration-analysis-mcp](https://github.com/LGDiMaggio/claude-stwinbox-diagnostics/tree/main/mcp-servers/vibration-analysis-mcp) (Apache-2.0)

| Tool | Arguments | Description |
|---|---|---|
| `get_vibration_data` | `site_name`, `asset_id`, `sensor_name`, `start`, `final?` | Fetch vibration time-series from CouchDB and load it into the server-side analysis store. Returns a `data_id`. |
| `list_vibration_sensors` | `site_name`, `asset_id` | List available sensor fields for an asset. |
| `compute_fft_spectrum` | `data_id`, `window?`, `top_n?` | Compute FFT amplitude spectrum (top-N peaks + statistics). |
| `compute_envelope_spectrum` | `data_id`, `band_low_hz?`, `band_high_hz?`, `top_n?` | Compute envelope spectrum for bearing fault detection (Hilbert transform). |
| `assess_vibration_severity` | `rms_velocity_mm_s`, `machine_group?` | Classify vibration severity per ISO 10816 (Zones A–D). |
| `calculate_bearing_frequencies` | `rpm`, `n_balls`, `ball_diameter_mm`, `pitch_diameter_mm`, `contact_angle_deg?`, `bearing_name?` | Compute bearing characteristic frequencies (BPFO, BPFI, BSF, FTF). |
| `list_known_bearings` | — | List all bearings in the built-in database. |
| `diagnose_vibration` | `data_id`, `rpm?`, `bearing_designation?`, `bearing_*?`, `bpfo_hz?`, `bpfi_hz?`, `bsf_hz?`, `ftf_hz?`, `machine_group?`, `machine_description?` | Full automated diagnosis: FFT + shaft features + bearing envelope + ISO 10816 + fault classification + markdown report. |
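
The frequencies returned by `calculate_bearing_frequencies` follow the standard bearing kinematic formulas. A minimal standalone sketch of those formulas (an illustration, not the server's actual implementation; the function name and the example geometry below are made up for demonstration):

```python
import math

def bearing_frequencies(rpm: float, n_balls: int, ball_d_mm: float,
                        pitch_d_mm: float, contact_angle_deg: float = 0.0) -> dict:
    """Classic bearing defect frequencies (Hz) from geometry."""
    fr = rpm / 60.0  # shaft rotation frequency, Hz
    # Geometry ratio: (d/D) * cos(contact angle)
    r = (ball_d_mm / pitch_d_mm) * math.cos(math.radians(contact_angle_deg))
    return {
        "FTF":  fr / 2.0 * (1.0 - r),             # fundamental train (cage)
        "BPFO": n_balls / 2.0 * fr * (1.0 - r),   # ball pass, outer race
        "BPFI": n_balls / 2.0 * fr * (1.0 + r),   # ball pass, inner race
        "BSF":  pitch_d_mm / (2.0 * ball_d_mm) * fr * (1.0 - r ** 2),  # ball spin
    }

# Hypothetical example: 9 balls, d=7.94 mm, D=39.04 mm, shaft at 1800 RPM
freqs = bearing_frequencies(rpm=1800, n_balls=9, ball_d_mm=7.94, pitch_d_mm=39.04)
```

Note the built-in sanity check: BPFO + BPFI always equals `n_balls * fr`, and BPFO is exactly `n_balls * FTF`.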

---

## Plan-Execute Runner
@@ -225,7 +243,7 @@ uv run plan-execute --show-history --json "How many observations exist for CH-1?

### End-to-end example

All five servers (IoTAgent, Utilities, FMSRAgent, TSFMAgent, VibrationAgent) are registered by default.
Run a question that exercises three of them with independent parallel steps:

```bash
@@ -304,6 +322,7 @@ runner = PlanExecuteRunner(
"Utilities": "utilities-mcp-server",
"FMSRAgent": "fmsr-mcp-server",
"TSFMAgent": "tsfm-mcp-server",
"VibrationAgent": "vibration-mcp-server",
},
)
```
@@ -334,6 +353,10 @@ Add the following to your Claude Desktop `claude_desktop_config.json`:
"TSFMAgent": {
"command": "/path/to/uv",
"args": ["run", "--project", "/path/to/AssetOpsBench", "tsfm-mcp-server"]
},
"VibrationAgent": {
"command": "/path/to/uv",
"args": ["run", "--project", "/path/to/AssetOpsBench", "vibration-mcp-server"]
}
}
}
@@ -396,8 +419,8 @@ uv run pytest src/ -v
│ │ stdio │ │
└───────────────────┼────────────┼─────────────────────┘
│ MCP protocol (stdio)
┌──────────┼──────────┬──────────┬──────────────┐
▼          ▼          ▼          ▼              ▼
IoTAgent  Utilities  FMSRAgent  TSFMAgent  VibrationAgent
(tools)    (tools)    (tools)    (tools)      (tools)
```
2 changes: 2 additions & 0 deletions pyproject.toml
@@ -24,6 +24,7 @@ dependencies = [
"pyyaml>=6.0",
"litellm>=1.0",
"python-dotenv>=1.0",
"scipy>=1.10.0",
]

[project.scripts]
@@ -32,6 +33,7 @@ iot-mcp-server = "servers.iot.main:main"
utilities-mcp-server = "servers.utilities.main:main"
fmsr-mcp-server = "servers.fmsr.main:main"
tsfm-mcp-server = "servers.tsfm.main:main"
vibration-mcp-server = "servers.vibration.main:main"


[dependency-groups]
5 changes: 5 additions & 0 deletions src/servers/vibration/__init__.py
@@ -0,0 +1,5 @@
# Vibration Analysis MCP Server for AssetOpsBench
#
# DSP core adapted from vibration-analysis-mcp by Luigi Di Maggio (LGDiMaggio)
# https://github.com/LGDiMaggio/claude-stwinbox-diagnostics
# SPDX-License-Identifier: Apache-2.0
139 changes: 139 additions & 0 deletions src/servers/vibration/couchdb_client.py
@@ -0,0 +1,139 @@
"""CouchDB client for fetching vibration sensor data.

Reuses the same environment variables as the IoT MCP server
(COUCHDB_URL, COUCHDB_DBNAME, COUCHDB_USERNAME, COUCHDB_PASSWORD).
"""

from __future__ import annotations

import logging
import os
from datetime import datetime
from typing import Optional

import couchdb3
import numpy as np
from dotenv import load_dotenv

load_dotenv()
logger = logging.getLogger("vibration-mcp-server")

COUCHDB_URL = os.environ.get("COUCHDB_URL")
COUCHDB_DBNAME = os.environ.get("COUCHDB_DBNAME")
COUCHDB_USER = os.environ.get("COUCHDB_USERNAME")
COUCHDB_PASSWORD = os.environ.get("COUCHDB_PASSWORD")


def _get_db() -> Optional[couchdb3.Database]:
"""Lazy CouchDB connection with error handling."""
if not COUCHDB_URL:
logger.warning("COUCHDB_URL not set — vibration data from CouchDB unavailable")
return None
try:
return couchdb3.Database(
COUCHDB_DBNAME,
url=COUCHDB_URL,
user=COUCHDB_USER,
password=COUCHDB_PASSWORD,
)
except Exception as e:
logger.error(f"CouchDB connection failed: {e}")
return None


def fetch_vibration_timeseries(
asset_id: str,
sensor_name: str,
start: str,
final: Optional[str] = None,
limit: int = 10000,
) -> Optional[tuple[np.ndarray, float]]:
"""
Fetch sensor time-series from CouchDB and return as numpy array.

Queries CouchDB for documents matching the given asset_id and time range,
extracts values from the specified sensor column, and estimates the
sample rate from the timestamp spacing.

Args:
asset_id: Asset identifier (e.g., 'Chiller 6').
sensor_name: Name of the sensor field in CouchDB documents.
start: ISO 8601 start timestamp.
final: Optional ISO 8601 end timestamp.
limit: Maximum number of documents to fetch.

Returns:
(signal_array, estimated_sample_rate) or None on error.
"""
db = _get_db()
if not db:
return None

try:
selector: dict = {
"asset_id": asset_id,
"timestamp": {"$gte": datetime.fromisoformat(start).isoformat()},
}
if final:
selector["timestamp"]["$lt"] = datetime.fromisoformat(final).isoformat()

res = db.find(
selector,
limit=limit,
sort=[{"asset_id": "asc"}, {"timestamp": "asc"}],
)
except Exception as e:
logger.error(f"CouchDB query failed: {e}")
return None

docs = res.get("docs", [])
if not docs:
logger.info(f"No documents found for {asset_id}/{sensor_name}")
return None

# Extract single sensor column
values: list[float] = []
timestamps: list[str] = []
for doc in docs:
if sensor_name in doc and "timestamp" in doc:
try:
values.append(float(doc[sensor_name]))
timestamps.append(doc["timestamp"])
except (ValueError, TypeError):
continue

if len(values) < 2:
logger.info(
f"Insufficient data points ({len(values)}) for {asset_id}/{sensor_name}"
)
return None

signal = np.array(values, dtype=np.float64)

# Estimate sample rate from timestamp differences
try:
ts = [datetime.fromisoformat(t) for t in timestamps]
diffs = [(ts[i + 1] - ts[i]).total_seconds() for i in range(len(ts) - 1)]
avg_dt = sum(diffs) / len(diffs)
sample_rate = 1.0 / avg_dt if avg_dt > 0 else 1.0
except Exception:
sample_rate = 1.0 # fallback: 1 Hz

return signal, sample_rate


def list_sensor_fields(asset_id: str) -> list[str]:
"""Return the sensor field names available for an asset in CouchDB."""
db = _get_db()
if not db:
return []
try:
res = db.find({"asset_id": asset_id}, limit=1)
if not res["docs"]:
return []
doc = res["docs"][0]
exclude = {"_id", "_rev", "asset_id", "timestamp"}
return sorted(k for k in doc.keys() if k not in exclude)
except Exception as e:
logger.error(f"Error listing sensors for {asset_id}: {e}")
return []
142 changes: 142 additions & 0 deletions src/servers/vibration/data_store.py
@@ -0,0 +1,142 @@
# SPDX-License-Identifier: Apache-2.0
# Adapted from https://github.com/LGDiMaggio/claude-stwinbox-diagnostics/tree/main/mcp-servers/vibration-analysis-mcp
"""
Server-side data store for vibration signals.

Signals are kept in memory on the MCP server so that large numpy arrays
never need to transit through the LLM conversation context. The agent
only sees compact summaries (statistics, peak lists, diagnosis reports).
"""

from __future__ import annotations

import hashlib
import time
from dataclasses import dataclass, field

import numpy as np
from numpy.typing import NDArray


def _kurtosis(x: NDArray) -> float:
"""Excess kurtosis (Fisher definition, normal = 0)."""
n = len(x)
if n < 4:
return 0.0
m = np.mean(x)
s = np.std(x, ddof=1)
if s < 1e-15:
return 0.0
return float(np.mean(((x - m) / s) ** 4) - 3.0)


@dataclass
class DataEntry:
"""A stored signal with metadata."""

signal: NDArray[np.floating]
sample_rate: float
created_at: float = field(default_factory=time.time)
metadata: dict = field(default_factory=dict)

@property
def n_samples(self) -> int:
return self.signal.shape[0]

@property
def n_channels(self) -> int:
return self.signal.shape[1] if self.signal.ndim > 1 else 1

@property
def duration_s(self) -> float:
return self.n_samples / self.sample_rate if self.sample_rate > 0 else 0

def summary(self) -> dict:
"""Return a compact summary (no raw data)."""
sig = self.signal
if sig.ndim == 1:
sig = sig.reshape(-1, 1)

channel_stats = {}
labels = self.metadata.get(
"axis_labels",
["X", "Y", "Z"][: sig.shape[1]]
if sig.shape[1] <= 3
else [f"CH{i}" for i in range(sig.shape[1])],
)
for i, label in enumerate(labels):
if i >= sig.shape[1]:
break
col = sig[:, i]
rms = float(np.sqrt(np.mean(col**2)))
channel_stats[label] = {
"mean": round(float(np.mean(col)), 6),
"std": round(float(np.std(col)), 6),
"rms": round(rms, 6),
"peak": round(float(np.max(np.abs(col))), 6),
"crest_factor": round(float(np.max(np.abs(col)) / rms), 2)
if rms > 0
else 0,
"kurtosis": round(float(_kurtosis(col)), 2),
}

return {
"n_samples": self.n_samples,
"n_channels": self.n_channels,
"sample_rate_hz": self.sample_rate,
"duration_s": round(self.duration_s, 4),
"channel_stats": channel_stats,
"metadata": {
k: v for k, v in self.metadata.items() if k != "axis_labels"
},
}


class DataStore:
"""Simple in-memory store for vibration signals."""

def __init__(self) -> None:
self._entries: dict[str, DataEntry] = {}

def put(
self,
data_id: str,
signal: NDArray[np.floating],
sample_rate: float,
metadata: dict | None = None,
) -> str:
"""Store a signal. Returns the data_id."""
self._entries[data_id] = DataEntry(
signal=np.asarray(signal, dtype=np.float64),
sample_rate=sample_rate,
metadata=metadata or {},
)
return data_id

def put_auto(
self,
signal: NDArray[np.floating],
sample_rate: float,
metadata: dict | None = None,
) -> str:
"""Store a signal with an auto-generated ID."""
h = hashlib.md5(signal.tobytes()[:1024]).hexdigest()[:8]
data_id = f"sig_{h}_{int(time.time()) % 100000}"
return self.put(data_id, signal, sample_rate, metadata)

def get(self, data_id: str) -> DataEntry | None:
return self._entries.get(data_id)

def remove(self, data_id: str) -> bool:
return self._entries.pop(data_id, None) is not None

def list_ids(self) -> list[str]:
return list(self._entries.keys())

def list_entries(self) -> list[dict]:
"""Return summaries of all stored entries."""
return [{"data_id": k, **v.summary()} for k, v in self._entries.items()]


# Global singleton — shared across all tools in this server
store = DataStore()