A Transformer Deep Learning Approach to Forecasting Sumatran Light Crude Oil Prices
DOI: https://doi.org/10.53863/kst.v7i02.1993

Keywords: Deep-learning, Forecasting, SLC, Transformer

Abstract
Time series forecasting plays an important role in understanding the dynamics of volatile data that depends on long-term historical patterns, such as crude oil prices. Parametric statistical approaches often face limitations due to their strict assumptions, making nonparametric deep learning methods a more flexible alternative. This study applies a Transformer-based deep learning model to predict the price of Sumatran Light Crude (SLC) oil, using the self-attention mechanism to capture long-term dependencies in time series data. Experiments evaluated various configurations of multi-head attention and numbers of layers, while keeping the model dimension and the input-output windows fixed. The results show that the Transformer configuration with 16 heads and 4 layers achieves the best performance, with a Root Mean Square Error (RMSE) of 8.19818. These findings indicate that the Transformer can effectively model long-term trends in SLC prices, although its sensitivity to short-term fluctuations remains limited. The main contribution of this research lies in the use of the Transformer as an alternative approach to forecasting crude oil prices in Indonesia, a task previously dominated by statistical methods and recurrent models. In practical terms, the results provide a basis for developing a more adaptive oil price forecasting system to support energy analysis and data-driven decision making.
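The self-attention mechanism and the RMSE metric mentioned in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the weights are random stand-ins for trained parameters, and the model dimension (64) and input window length (30) are assumptions, since the paper's exact hyperparameters beyond 16 heads and 4 layers are not given here.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, n_heads, rng):
    """One multi-head self-attention pass over a window of embedded observations.

    x: (seq_len, d_model) array; returns the attended output and the
    per-head attention weights (n_heads, seq_len, seq_len).
    """
    seq_len, d_model = x.shape
    assert d_model % n_heads == 0
    d_head = d_model // n_heads
    # Random projections stand in for the learned Q/K/V/output weights
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))
    # Project, then split the model dimension across heads
    q = (x @ Wq).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    k = (x @ Wk).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    v = (x @ Wv).reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention: every time step attends to every other,
    # which is how the Transformer captures long-range dependencies
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)   # (heads, seq, seq)
    attn = softmax(scores, axis=-1)
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo, attn

def rmse(y_true, y_pred):
    # Root Mean Square Error, the evaluation metric reported in the study
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

rng = np.random.default_rng(0)
window = rng.standard_normal((30, 64))    # assumed 30-step window, d_model = 64
out, attn = multi_head_self_attention(window, n_heads=16, rng=rng)
print(out.shape)                          # output keeps the input shape
print(np.allclose(attn.sum(axis=-1), 1.0))  # each query's weights sum to 1
```

In a full model, 4 such layers (each followed by a feed-forward block and residual connections) would be stacked, and a final linear head would map the last representations to the forecast horizon.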
License
Copyright (c) 2025 Ni Luh Putu Ika Candrawengi, Yadhurani Dewi Amritha, Md. Wira Putra Dananjaya

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors retain copyright and grant the journal right of first publication, with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike 4.0 International License that allows others to share the work with an acknowledgment of the work's authorship and its initial publication in this journal.