In Recent Machine Learning Research, the Salesforce AI Team Developed a New Time-Series Forecasting Model Called ETSformer, Which Exploits the Principle of Exponential Smoothing to Improve Transformers for Time-Series Forecasting

Time-series forecasting has gained tremendous importance in recent years. It is the task of predicting future values from previous data, particularly numerical data gathered in a temporal or sequentially ordered fashion. Accurate forecasting of such data has advantages in many fields. Anticipating e-commerce sales, for example, enables businesses to improve supply-chain decisions and develop more effective pricing strategies. Another area where forecasting is crucial is AIOps, where machine learning models analyze the vast amounts of time-series data produced by IT operations to dramatically increase operational efficiency. A classical tool for forecasting such data is exponential smoothing: a family of techniques built on the idea that forecasts are weighted averages of historical observations, with weights decaying exponentially backward in time. In short, recent data is given more weight than older data, reflecting the intuition that recent history should matter more when forming new predictions or recognizing current patterns.
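To make the weighting scheme concrete, here is a minimal sketch of simple exponential smoothing in plain Python; the smoothing factor alpha and the toy sales figures are illustrative assumptions, not values from the paper.

```python
import numpy as np

def simple_exponential_smoothing(series, alpha=0.5):
    """One-step-ahead forecast via an exponentially weighted average of past values."""
    level = series[0]
    for value in series[1:]:
        # level = alpha * newest observation + (1 - alpha) * previous level,
        # which unrolls to a weight of alpha * (1 - alpha)**k on the value k steps back.
        level = alpha * value + (1 - alpha) * level
    return level

# Toy monthly sales figures (made up for illustration).
sales = np.array([112.0, 118.0, 132.0, 129.0, 121.0, 135.0])
print(simple_exponential_smoothing(sales, alpha=0.5))
```

Larger alpha values make the weights decay faster, so the forecast tracks the most recent observations more closely.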

Trend and seasonality are the two most common patterns found in time-series data. Forecasting techniques rely heavily on decomposing a time series into trend and seasonality components, since doing so makes it simpler to model each component separately and produces more accurate forecasts. Such decompositions, combined with exponentially weighted decay, encode prior knowledge about time-series structure directly into forecasting models. The popularity and accuracy of these techniques are a clear indication of their advantages. At the same time, simple statistical models are no longer practical given the sheer volume of time-series metrics, so more powerful machine learning and deep learning models are used to produce accurate long-term projections. However, time-series data is frequently erratic and noisy, and current deep learning approaches do not adequately model trend and seasonality. Existing approaches may incorporate some prior knowledge, but it is not time-series specific, which leads to erroneous long-term projections and inadequate modeling of temporal patterns.
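As a concrete illustration of such a decomposition, the sketch below applies the classical seasonal_decompose routine from statsmodels to a synthetic monthly series; the series itself, the additive model, and the period of 12 are assumptions made purely for the example and are unrelated to ETSformer's internals.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly series: upward trend + yearly seasonality + noise (illustrative only).
index = pd.date_range("2018-01-01", periods=48, freq="MS")
trend = np.linspace(100, 160, len(index))
seasonality = 10 * np.sin(2 * np.pi * np.arange(len(index)) / 12)
noise = np.random.default_rng(0).normal(0, 2, len(index))
series = pd.Series(trend + seasonality + noise, index=index)

# Classical additive decomposition into trend, seasonal, and residual components.
result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())   # smoothed long-term movement
print(result.seasonal.head())         # repeating yearly pattern
```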

Researchers from Salesforce have introduced ETSformer, a time-series forecasting method that addresses the shortcomings of existing approaches. Their "exponential smoothing transformers" method adapts the transformer architecture to handle time-series data. Inspired by traditional exponential smoothing techniques, ETSformer combines the expressive power of transformers with the efficiency of exponential smoothing to deliver state-of-the-art performance. The architecture is a transformer encoder-decoder that carries out three main steps. The encoder handles the decomposition step: it takes the time series as input and extracts level, growth, and seasonality components from it. These components are passed to the decoder in the extrapolation step, which projects them into the future. In the final step, the extrapolated components are composed into a single forecast before leaving the decoder.
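For intuition, the following PyTorch sketch mirrors that decompose-extrapolate-compose flow with generic placeholder layers. It is not the authors' model (see their open-sourced code for the real implementation), and every layer choice, dimension, and name below is an assumption made for illustration.

```python
import torch
import torch.nn as nn

class ETSformerSketch(nn.Module):
    """Simplified schematic of the decompose -> extrapolate -> compose flow.

    NOT the official ETSformer architecture; it only mirrors the data flow
    described above using generic placeholder layers.
    """

    def __init__(self, d_model=64, nhead=4, horizon=24):
        super().__init__()
        self.embed = nn.Linear(1, d_model)
        # Placeholder encoder standing in for ETSformer's decomposition stage.
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=2,
        )
        # Placeholder decoder heads that extrapolate each component over the horizon.
        self.level_head = nn.Linear(d_model, 1)
        self.growth_head = nn.Linear(d_model, horizon)
        self.season_head = nn.Linear(d_model, horizon)

    def forward(self, x):
        # x: (batch, lookback, 1) input window.
        h = self.encoder(self.embed(x))
        summary = h[:, -1]                    # representation of the lookback window
        level = self.level_head(summary)      # current level estimate, shape (batch, 1)
        growth = self.growth_head(summary)    # extrapolated growth, shape (batch, horizon)
        season = self.season_head(summary)    # extrapolated seasonality, shape (batch, horizon)
        # Compose the extrapolated components into the final forecast.
        return level + growth + season

model = ETSformerSketch()
print(model(torch.randn(8, 96, 1)).shape)     # torch.Size([8, 24])
```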

The team has demonstrated through numerous evaluations that fusing traditional exponential smoothing techniques with a transformer architecture is more than a sound idea in theory. ETSformer achieves state-of-the-art performance on six real-world time-series datasets from various application domains, including traffic and financial time-series forecasting. The team's paper offers a more thorough analysis of the empirical results and comparisons with other baselines. Furthermore, instead of noisy, erroneous decompositions, ETSformer generates interpretable decompositions of the forecasted quantities, with a clear trend and seasonal pattern. Because forecasts are built from a combination of comprehensible time-series components, each component can be visualized separately to understand how trend and seasonality affect the prediction. This interpretability is a crucial property, since it makes the decisions of AI systems as transparent as possible. ETSformer's state-of-the-art performance shows that integrating ETS methods with a transformer-based design can have real-world impact. The team has open-sourced their code to encourage further study and commercial use of ETSformer for time-series forecasting.

This article is written as a research summary by Marktechpost staff based on the research paper 'ETSformer: Exponential Smoothing Transformers for Time-series Forecasting'. All credit for this research goes to the researchers on this project. Check out the paper, GitHub link and reference article.



Khushboo Gupta is a consulting intern at MarktechPost. She is currently pursuing her B.Tech from the Indian Institute of Technology (IIT), Goa. She is passionate about the fields of Machine Learning, Natural Language Processing and Web Development. She enjoys learning more about the technical field by participating in several challenges.

