What You Will Learn
AMI (Advanced Metering Infrastructure) meters generate millions of data points every day. Hidden in that data are anomalies: unusual voltage readings that signal equipment problems, meter tampering, or phase imbalances. In this guide you will:
- Load 15-minute AMI voltage data from the SP&L dataset
- Explore what "normal" voltage patterns look like
- Train an Isolation Forest model to detect anomalies without labeled data
- Build a simple autoencoder in PyTorch that learns to reconstruct normal patterns
- Flag anomalies based on reconstruction error and evaluate both approaches
What is unsupervised anomaly detection? In Guides 01 and 04, we had labels—we knew which events were outages or failures. But anomaly detection often works without labels. The model learns what "normal" looks like and flags anything that deviates significantly. This is powerful because you don't need to have seen every type of anomaly before—the model catches anything unusual.
SP&L Data You Will Use
- customer_interval_data.csv (load_customer_interval_data()) — 15-minute AMI data for ~500 sampled customers with customer_id, transformer_id, feeder_id, substation_id, timestamp, demand_kw, energy_kwh, voltage_v, and power_factor
- weather_data.csv (load_weather_data()) — hourly weather for context
Additional Libraries
torch (PyTorch) is used for the autoencoder in the second half. You can complete the Isolation Forest section without it.
Which terminal should I use? On Windows, open Anaconda Prompt from the Start Menu (or PowerShell / Command Prompt if Python is already in your PATH). On macOS, open Terminal from Applications → Utilities. On Linux, open your default terminal. All pip install commands work the same across platforms.
PyTorch on Windows: The command pip install torch installs the CPU-only version, which is all you need for this guide. If you have an NVIDIA GPU and want GPU acceleration, visit pytorch.org/get-started for the platform-specific install command with CUDA support. The CPU version works identically on Windows, macOS, and Linux.
Part A: Isolation Forest
Verify Your Setup
Before starting, verify that your environment is configured correctly. Run this cell first to confirm all dependencies are installed and data files are accessible.
Working directory: All guides assume your working directory is the repository root (Dynamic-Network-Model/). Start Jupyter Lab from there: cd Dynamic-Network-Model && jupyter lab
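A minimal verification cell might look like the sketch below. It checks that the core libraries are importable and that the AMI data file is visible from the current working directory. The data path used here is an assumption — adjust it to match where customer_interval_data.csv lives in your checkout.

```python
# Setup check: confirms core libraries import and reports whether the AMI
# data file is visible. The data path below is a guess -- adjust as needed.
import importlib.util
from pathlib import Path

def check_setup(data_path="data/customer_interval_data.csv"):
    status = {}
    for lib in ["pandas", "numpy", "sklearn"]:
        status[lib] = importlib.util.find_spec(lib) is not None
    # torch is only needed for Part B (the autoencoder)
    status["torch (optional)"] = importlib.util.find_spec("torch") is not None
    status["data file"] = Path(data_path).exists()
    for name, ok in status.items():
        print(f"{'OK ' if ok else '?? '} {name}")
    return status

status = check_setup()
```

If any required library shows `??`, install it with pip before continuing; a missing data file usually means Jupyter was started from the wrong directory.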
Having trouble? Check our Troubleshooting Guide for solutions to common setup and data loading issues.
Load and Explore AMI Data
Focus on Voltage Readings
You should see voltage oscillating in a daily pattern. Occasional spikes or dips are the anomalies we want to detect.
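If you want to see what this looks like before loading the real data, the sketch below builds a synthetic week of 15-minute voltage readings (nominal 240 V with a mild daily swing) and injects a few dips. With the actual dataset you would plot voltage_v against timestamp instead.

```python
# Synthetic illustration of the expected shape: a daily voltage oscillation
# with a few injected anomalous dips. Not the real SP&L data.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless-safe backend
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
t = np.arange(4 * 24 * 7)                       # one week of 15-min intervals
voltage = 240 + 2 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 0.4, t.size)
voltage[[300, 301, 450]] -= 8                   # inject a few anomalous dips

plt.figure(figsize=(10, 3))
plt.plot(t / 96, voltage, lw=0.8)
plt.xlabel("day")
plt.ylabel("voltage (V)")
plt.title("Synthetic AMI voltage: daily oscillation with injected dips")
plt.tight_layout()
```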
Engineer Features for Anomaly Detection
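One common recipe — and roughly what this section does — is to roll the raw 15-minute readings up into hourly statistics per customer. The sketch below shows the pattern on a small synthetic frame; the exact feature set in the notebook may differ. With real data you would start from the customer_id, timestamp, and voltage_v columns of the interval data.

```python
# Hourly statistical features from 15-minute voltage readings,
# demonstrated on a synthetic single-customer frame.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "timestamp": pd.date_range("2024-01-01", periods=96, freq="15min"),
    "customer_id": "C001",
    "voltage_v": 240 + rng.normal(0, 1.5, 96),
})

# per-customer hourly mean / spread / extremes of voltage
features = (
    df.set_index("timestamp")
      .groupby("customer_id")["voltage_v"]
      .resample("1h")
      .agg(["mean", "std", "min", "max"])
)
features["range"] = features["max"] - features["min"]
print(features.head())
```

The hourly range and standard deviation are often the most informative columns here: a sudden widening is exactly the kind of deviation both models in this guide can pick up.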
Train the Isolation Forest
How does Isolation Forest work? It builds random decision trees that try to isolate each data point. Normal points are similar to many others and take many splits to isolate. Anomalies are rare and different, so they get isolated quickly with fewer splits. The "anomaly score" reflects how easy a point was to isolate.
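The idea above can be shown in a few lines of scikit-learn. This is a minimal sketch on synthetic two-column features (not the guide's exact feature matrix): most points cluster near 240 V with a small spread, and two planted outliers should be isolated quickly and flagged.

```python
# Minimal Isolation Forest example on synthetic features.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
normal = rng.normal(loc=[240.0, 1.0], scale=[1.0, 0.2], size=(500, 2))
outliers = np.array([[225.0, 6.0], [255.0, 5.0]])   # planted anomalies
X = np.vstack([normal, outliers])

iso = IsolationForest(n_estimators=200, contamination=0.01, random_state=0)
labels = iso.fit_predict(X)      # -1 = anomaly, 1 = normal
scores = iso.score_samples(X)    # lower = easier to isolate = more anomalous

print("flagged:", int((labels == -1).sum()))
print("planted outliers flagged:", bool((labels[-2:] == -1).all()))
```

The contamination parameter is the fraction of points you expect to be anomalous; it sets the score threshold, so it is a tuning choice rather than something the model learns.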
Visualize the Anomalies
Part B: Autoencoder (PyTorch)
Build the Autoencoder
An autoencoder is a neural network that learns to compress data into a small representation and then reconstruct it. If the network is trained on normal data, it will reconstruct normal patterns well but struggle with anomalies—producing high reconstruction error.
What is a bottleneck? The bottleneck layer (size 3) forces the network to compress 6 input features into just 3 numbers. This compression forces the model to learn the most important patterns in the data. When an anomaly comes through, it doesn't fit the learned compression pattern, and the reconstruction will be poor.
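A network matching this description — 6 input features squeezed through a size-3 bottleneck — can be sketched as below. The intermediate layer sizes are illustrative, not necessarily the notebook's exact architecture.

```python
# A small autoencoder with a 6 -> 3 -> 6 shape, as described above.
import torch
import torch.nn as nn

class VoltageAutoencoder(nn.Module):
    def __init__(self, n_features=6, bottleneck=3):
        super().__init__()
        # encoder: compress 6 features into the 3-unit bottleneck
        self.encoder = nn.Sequential(
            nn.Linear(n_features, 4), nn.ReLU(),
            nn.Linear(4, bottleneck),
        )
        # decoder: reconstruct the original 6 features from the bottleneck
        self.decoder = nn.Sequential(
            nn.Linear(bottleneck, 4), nn.ReLU(),
            nn.Linear(4, n_features),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = VoltageAutoencoder()
out = model(torch.randn(8, 6))   # quick shape check on a batch of 8
print(out.shape)
```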
Train the Autoencoder on Normal Data
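A self-contained training sketch, using synthetic stand-in data for the scaled "normal" feature rows (in the notebook you would train only on rows that look normal, after scaling). The architecture, learning rate, and epoch count here are illustrative.

```python
# Train a 6 -> 3 -> 6 autoencoder with MSE reconstruction loss.
# X is a synthetic stand-in for scaled normal-feature rows.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(
    nn.Linear(6, 4), nn.ReLU(), nn.Linear(4, 3),   # encoder
    nn.Linear(3, 4), nn.ReLU(), nn.Linear(4, 6),   # decoder
)
X = torch.randn(512, 6) * 0.1 + 0.5                # stand-in training data
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

losses = []
for epoch in range(100):
    opt.zero_grad()
    loss = loss_fn(model(X), X)   # reconstruction loss: output vs input
    loss.backward()
    opt.step()
    losses.append(loss.item())

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The key point is the loss target: the network is trained to reproduce its own input, so the loss curve measures how well it has learned the structure of normal data.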
Detect Anomalies by Reconstruction Error
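Once you have a per-sample reconstruction error, flagging reduces to picking a threshold. A common recipe, sketched below on synthetic error values, is to use a high percentile of the errors observed on the (normal) training data; the percentile itself is a tuning choice.

```python
# Threshold reconstruction errors at a high percentile of training errors.
# Error values here are synthetic; two large errors are planted at the end.
import numpy as np

rng = np.random.default_rng(7)
train_errors = rng.gamma(shape=2.0, scale=0.01, size=1000)  # errors on normal data
new_errors = np.append(rng.gamma(2.0, 0.01, size=98), [0.30, 0.45])

threshold = np.percentile(train_errors, 99)   # 99th percentile of normal errors
flags = new_errors > threshold
print("threshold:", round(float(threshold), 4))
print("flagged:", int(flags.sum()))
```

Raising the percentile trades recall for precision: a 99.9th-percentile threshold flags fewer false positives but may miss subtler anomalies.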
Compare Both Methods
Investigate the Anomalies
Anomalies flagged by both methods are the most trustworthy. Look for patterns: are they clustered on specific customers (possible equipment issue), specific times (possible load event), or specific voltages (possible tap changer malfunction)?
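One way to look for those patterns is to group the flagged rows by customer and by hour of day. The frame below is synthetic, with column names mirroring the dataset fields.

```python
# Group flagged anomalies by customer and hour of day to spot clustering.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
flagged = pd.DataFrame({
    "customer_id": rng.choice(["C017", "C204", "C388"], size=40, p=[0.6, 0.25, 0.15]),
    "timestamp": pd.to_datetime("2024-03-01")
                 + pd.to_timedelta(rng.integers(0, 7 * 24, 40), unit="h"),
    "voltage_v": rng.normal(228, 3, 40),
})

by_customer = flagged["customer_id"].value_counts()    # concentrated? equipment issue
by_hour = flagged["timestamp"].dt.hour.value_counts()  # concentrated? load event
print(by_customer.head())
print(by_hour.sort_index().head())
```

If one customer dominates the counts, inspect that meter and its transformer; if one hour dominates, look for a system-wide event at that time.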
What You Built and Next Steps
- Loaded and explored 15-minute AMI voltage data from ~500 sampled customers
- Engineered hourly statistical features from raw voltage readings
- Trained an Isolation Forest for unsupervised anomaly detection
- Built and trained a PyTorch autoencoder on normal voltage patterns
- Flagged anomalies using reconstruction error thresholds
- Compared both methods and investigated high-confidence detections
Ideas to Try Next
- Correlate with outages: Check whether detected anomalies preceded actual outage events in the outage history (load_outage_history())
- Meter tampering detection: Look for meters with sudden consumption drops but normal voltage (possible bypass)
- Phase imbalance detection: Compare voltage patterns across phases to detect phase-level issues
- Real-time sliding window: Implement a streaming version that processes data in 1-hour windows
- Variational autoencoder: Replace the basic autoencoder with a VAE for probabilistic anomaly scoring
Key Terms Glossary
- Anomaly detection — identifying data points that deviate significantly from normal patterns
- Isolation Forest — an algorithm that detects anomalies by how easily data points can be isolated with random splits
- Autoencoder — a neural network that compresses data and reconstructs it; anomalies have high reconstruction error
- Reconstruction error — the difference between the input and the autoencoder's output; higher = more anomalous
- Unsupervised learning — learning from data without labels; the model discovers structure on its own
- AMI — Advanced Metering Infrastructure; smart meters that report voltage and consumption at 15-minute intervals
- Bottleneck layer — the smallest hidden layer in an autoencoder, forcing information compression
Ready to Level Up?
In the advanced guide, you'll build a Variational Autoencoder for probabilistic anomaly scoring and implement real-time streaming detection.
Go to Advanced Anomaly Detection →