Self Supervised Transformers for High Dimensional Time Series Anomaly Detection

Authors

  • Aswadi Jaya, PGRI University Palembang
  • Derlina, State University of Medan
  • Qurotul Aini, Satya Wacana Christian University
  • Agung Rizky, University of Raharja
  • Richard Evans, Adi Journal Incorporation

DOI:

https://doi.org/10.34306/b-front.v6i1.1078

Keywords:

Self-Supervised Learning, Transformer Models, Anomaly Detection, Deep Learning, Software Analytics

Abstract

This study addresses anomaly detection in high-dimensional time series data within the context of Artificial Intelligence (AI)-driven software development, where modern systems generate large temporal data streams and reliable monitoring remains difficult due to noise, complexity, and limited labeled anomalies. The objective of this research is to develop an effective and scalable anomaly detection framework based on self-supervised transformer models that can learn meaningful temporal representations without heavy reliance on manual annotation. The proposed method applies self-supervised pretraining through masked sequence reconstruction and contrastive temporal learning on large-scale, unlabeled multivariate time series datasets, followed by transformer-based attention mechanisms to capture long-range dependencies and compute anomaly scores. Experiments are conducted on benchmark datasets and real-world system log data, implemented with Python-based deep learning tools and transformer architectures, to evaluate detection performance. The results indicate that the proposed approach improves detection accuracy and reduces false-positive rates compared to traditional statistical techniques and supervised deep learning models, particularly in high-dimensional, low-label settings. In conclusion, integrating self-supervised learning with transformer architectures provides a robust and generalizable solution for time series anomaly detection, contributing to software analytics and monitoring systems by lowering labeling costs and improving adaptability across application domains.
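The scoring pipeline the abstract outlines (mask timesteps, reconstruct the sequence through attention, flag timesteps with high reconstruction error) can be sketched minimally in NumPy. This is an illustrative toy, not the paper's architecture: the single attention layer uses untrained random projections (`Wq`, `Wk`, `Wv`, `Wo` are assumed names), training and the contrastive objective are omitted, and shapes are arbitrary.

```python
import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention with a numerically stable softmax.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V

def anomaly_scores(x, mask_ratio=0.15, d_model=16, seed=0):
    """Mask random timesteps, reconstruct via one attention layer,
    and score each timestep by its reconstruction error.
    NOTE: weights are random (untrained) -- pipeline shape only."""
    rng = np.random.default_rng(seed)
    T, D = x.shape
    # Hypothetical projection weights; a real model would learn these.
    Wq, Wk, Wv = (rng.normal(scale=D ** -0.5, size=(D, d_model)) for _ in range(3))
    Wo = rng.normal(scale=d_model ** -0.5, size=(d_model, D))
    masked = x.copy()
    idx = rng.choice(T, size=max(1, int(mask_ratio * T)), replace=False)
    masked[idx] = 0.0  # zero vector stands in for a learned mask token
    h = attention(masked @ Wq, masked @ Wk, masked @ Wv)
    recon = h @ Wo
    return np.linalg.norm(x - recon, axis=1)  # per-timestep error

T, D = 128, 8
x = np.random.default_rng(1).normal(size=(T, D))
x[60] += 6.0  # inject a spike anomaly at t=60
s = anomaly_scores(x)
print(s.shape)
```

In a trained model the reconstruction error on normal timesteps would be driven down by the masked-reconstruction objective, so anomalous timesteps stand out more sharply than this untrained sketch suggests.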


Published

2026-03-26

How to Cite

Aswadi Jaya, Derlina, Qurotul Aini, Agung Rizky, & Richard Evans. (2026). Self Supervised Transformers for High Dimensional Time Series Anomaly Detection. Blockchain Frontier Technology, 6(1), 25–37. https://doi.org/10.34306/b-front.v6i1.1078