
DDTime: A Lightweight Plugin for Time-Series Dataset Distillation

AI Research
March 26, 2026
2 min read

In the rapidly evolving field of time-series forecasting, the computational demands of training models on massive datasets have become a significant bottleneck. Dataset distillation offers a promising solution by synthesizing compact datasets that preserve the learning behavior of full data, but extending this technique to time-series forecasting has proven challenging due to temporal biases and lack of diversity in synthetic samples. A new framework called DDTime addresses these fundamental limitations through innovative spectral alignment and information bottleneck techniques, achieving substantial performance gains while introducing minimal computational overhead.

DDTime tackles two core challenges that have hindered previous attempts at time-series dataset distillation. First, strong autocorrelation in temporal data distorts value-term alignment between teacher and student models: traditional pointwise alignment mixes trend, seasonal, and high-frequency components within a single loss. Second, unlike classification tasks with inherent categorical priors, time-series forecasting lacks explicit diversity mechanisms, so synthetic trajectories become redundant and overall information density drops. The framework builds upon the first-order condensation decomposition established in prior work but introduces crucial innovations to overcome these limitations.
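To see why pointwise alignment is problematic, consider a toy series with the three components the paragraph mentions (the series and the `autocorr` helper below are our own illustration, not from the paper). Adjacent and seasonally offset horizons are strongly correlated, so per-timestep error terms are far from independent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy series: trend + daily seasonality + noise -- the three components
# that a single pointwise loss would mix together.
t = np.arange(500)
series = 0.002 * t + np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(t.size)

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

# Neighboring horizons are almost perfectly correlated, and horizons half
# a season apart are strongly anti-correlated: pointwise errors across a
# forecast window are anything but independent.
print(autocorr(series, 1))   # strongly positive
print(autocorr(series, 12))  # strongly negative (half a seasonal period)
```

This correlation structure is exactly what motivates moving the value term into a domain where components decouple.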

The methodology employs frequency-domain alignment to mitigate autocorrelation-induced bias, transforming the traditional temporal-domain value term into a spectral space where different frequency components become asymptotically uncorrelated. This approach ensures spectral consistency while preserving fine-grained temporal fidelity, effectively decorrelating the horizon-wise dependencies that plague traditional distillation methods. Additionally, DDTime introduces an inter-sample regularization inspired by the information bottleneck principle, which enhances diversity among synthetic trajectories by minimizing pairwise redundancy and maximizing task-relevant information across the distilled dataset.
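A minimal sketch of the two ideas, under our own simplifying assumptions (the paper's exact loss formulations are not reproduced here; the function names `spectral_value_loss` and `redundancy_penalty` are illustrative, not from DDTime):

```python
import numpy as np

def spectral_value_loss(student_pred, teacher_pred):
    """Value-term alignment in the frequency domain: compare rFFT
    coefficients instead of raw time steps, so low-frequency (trend),
    seasonal, and high-frequency errors land in separate bins rather
    than being mixed into one pointwise residual."""
    s = np.fft.rfft(student_pred, axis=-1)
    t = np.fft.rfft(teacher_pred, axis=-1)
    return float(np.mean(np.abs(s - t) ** 2))

def redundancy_penalty(synthetic):
    """Information-bottleneck-inspired diversity term: mean pairwise
    cosine similarity among synthetic trajectories. Minimizing it pushes
    samples apart, raising the information density of the distilled set."""
    x = synthetic - synthetic.mean(axis=-1, keepdims=True)
    x = x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-12)
    sim = x @ x.T                                # pairwise cosine similarities
    n = sim.shape[0]
    return float(sim[~np.eye(n, dtype=bool)].mean())  # off-diagonal mean

# Duplicated trajectories are maximally redundant; phase-shifted ones are not.
grid = np.linspace(0, 4 * np.pi, 96)
base = np.sin(grid)
dupes = np.stack([base, base, base])
shifted = np.stack([np.sin(grid + p) for p in (0.0, 2.0, 4.0)])
print(redundancy_penalty(dupes))    # maximal: every pair is identical
print(redundancy_penalty(shifted))  # much lower: trajectories differ
```

In a trajectory-matching condensation loop, a weighted sum of these two terms would replace the plain pointwise value loss; the redundancy term acts only on the synthetic samples themselves.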

Experimental results across 20 benchmark datasets demonstrate DDTime's consistent superiority over existing distillation methods. The framework achieves approximately 30% relative accuracy gains while introducing only about 2.49% computational overhead, making it both effective and efficient. Comprehensive testing with diverse forecasting architectures, including Transformer-based models, MLP-based approaches, and patch-based designs, shows robust performance improvements, with gains particularly pronounced when DDTime is integrated into trajectory-matching condensation paradigms where parameter and value terms are jointly optimized.

The implications of this research extend beyond immediate performance improvements, offering a more sustainable approach to time-series model development by reducing computational requirements while maintaining or even enhancing forecasting accuracy. The framework's compatibility with various distillation paradigms and forecasting architectures suggests broad applicability across domains including energy, finance, transportation, and environmental monitoring. As dataset sizes continue to grow in time-series applications, techniques like DDTime will become increasingly valuable for making advanced forecasting models more accessible and environmentally sustainable.

Original Source

Read the complete research paper

View on arXiv

About the Author

Guilherme A.

Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.

Connect on LinkedIn