EMAformer: Enhancing Transformer through Embedding Armor for Time Series Forecasting

arXiv — cs.LG · Wednesday, November 12, 2025 at 5:00:00 AM
The introduction of EMAformer marks a significant advancement in multivariate time series forecasting, particularly in strengthening the Transformer architecture, which has previously lagged behind MLP-based models. To address unstable inter-channel relationships, EMAformer incorporates three key inductive biases into its embeddings: global stability, phase sensitivity, and cross-axis specificity. These innovations yield state-of-the-art performance on 12 real-world benchmarks, reducing forecasting error by 2.73% in MSE and 5.15% in MAE. This progress not only highlights the potential of Transformer-based approaches but also underscores the importance of continuous improvement in forecasting models, which are crucial across many sectors. The code for EMAformer is publicly available on GitHub, encouraging further exploration and application of the model in real-world scenarios.
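The exact EMAformer embedding design is detailed in the paper and its GitHub repository; purely as a rough illustration of the general idea of stabilizing inter-channel relationships with a fixed, learnable per-channel embedding in a channels-as-tokens Transformer, here is a minimal numpy sketch. All names, shapes, and the single attention layer are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def channel_attention_forecast(x, w_embed, channel_id, w_out):
    """Toy channels-as-tokens forecaster (illustrative, not EMAformer).

    x          : (T, C) history, one column per variable/channel
    w_embed    : (T, d) linear map from a channel's series to a token
    channel_id : (C, d) learnable per-channel embedding -- the stable
                 identity intended to steady inter-channel attention
    w_out      : (d, H) projection to an H-step forecast per channel
    """
    tokens = x.T @ w_embed + channel_id          # (C, d) one token per channel
    scores = tokens @ tokens.T / np.sqrt(tokens.shape[1])
    mixed = softmax(scores, axis=-1) @ tokens    # attention across channels
    return mixed @ w_out                         # (C, H) forecasts

# usage with random weights (training loop omitted)
rng = np.random.default_rng(0)
y = channel_attention_forecast(
    rng.normal(size=(24, 3)),        # 24 past steps, 3 channels
    rng.normal(size=(24, 8)),        # token dim d = 8
    0.1 * rng.normal(size=(3, 8)),   # per-channel identity embeddings
    rng.normal(size=(8, 6)),         # horizon H = 6
)
```

Because `channel_id` is shared across all windows, the attention pattern between channels no longer depends solely on the (possibly noisy) content of each input window, which is the intuition behind the "global stability" bias described above.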
— via World Pulse Now AI Editorial System


Recommended Readings
MoCap2Radar: A Spatiotemporal Transformer for Synthesizing Micro-Doppler Radar Signatures from Motion Capture
Positive · Artificial Intelligence
The article presents a machine learning approach for synthesizing micro-Doppler radar spectrograms from Motion-Capture (MoCap) data. It formulates the translation as a windowed sequence-to-sequence task using a transformer-based model that captures spatial relations among MoCap markers and temporal dynamics across frames. Experiments show that the method produces plausible radar spectrograms and generalizes well, indicating its potential for applications in edge computing and IoT radars.
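The windowed sequence-to-sequence framing above can be sketched in a few lines: slice the MoCap frame sequence into overlapping windows, then map each window to spectrogram frames. The window length, hop, and the linear projection standing in for the transformer below are all illustrative assumptions, not the MoCap2Radar architecture:

```python
import numpy as np

def make_windows(frames, win, hop):
    """Slice a (T, M*3) MoCap sequence (M markers, xyz each) into
    overlapping windows of length `win` with stride `hop`."""
    starts = range(0, frames.shape[0] - win + 1, hop)
    return np.stack([frames[s:s + win] for s in starts])  # (N, win, M*3)

def project_to_spectrogram(windows, w_proj):
    """Stand-in for the learned seq2seq model: map each frame of each
    window to `F` Doppler bins via a shared linear projection."""
    return windows @ w_proj  # (N, win, F)

# usage: 100 frames of 17 markers (51 coords), 40 Doppler bins
rng = np.random.default_rng(0)
frames = rng.normal(size=(100, 51))
windows = make_windows(frames, win=32, hop=16)        # (5, 32, 51)
spec = project_to_spectrogram(windows, rng.normal(size=(51, 40)))
```

In the paper this per-window mapping is a transformer attending over markers (spatial) and frames (temporal); the point of the sketch is only the windowing and the per-window input/output shapes.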