Multi-Objective Optimization of Regularized Self-Attention Regression Models Using NSGA-II: A Methodological Framework


Maja CZYŻEWSKA

Military University of Technology, Warsaw, Poland

Abstract

Forecasting financial time series with deep neural networks requires models that are both expressive and well regularized. Regularized Self-Attention Regression (RSAR) is a hybrid architecture that combines LSTM, self-attention and convolutional components with explicit regularization, and has shown promising results in financial price prediction. However, its original formulation relies on a small number of manually selected hyperparameter configurations and a single-objective tuning strategy, which limits its adaptability to datasets with different volatility levels, noise structures and temporal dynamics. This paper addresses that gap by proposing a methodological framework that extends RSAR with multi-objective hyperparameter optimization based on the Non-dominated Sorting Genetic Algorithm II (NSGA-II). The framework defines a structured decision vector for key architectural and regularization parameters, formulates a bi-objective problem that jointly minimizes validation error and the train–validation gap, and outlines an optimization procedure built on non-dominated sorting, diversity preservation and elitist selection. The contribution is purely methodological: the paper provides a detailed description of the RSAR architecture, the underlying self-attention and regularization mechanisms, and their integration into a multi-objective search procedure. The resulting framework offers an implementation-ready basis for automatically adapting RSAR models to diverse financial markets, explicitly balancing predictive accuracy and overfitting risk. Experimental evaluation is intentionally omitted and will be presented in a subsequent study.
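Although the paper defers implementation details to a subsequent study, the two NSGA-II components named in the abstract, non-dominated sorting and crowding-distance diversity preservation, can be sketched in plain Python. The objective vectors below are purely illustrative pairs of (validation error, train–validation gap); they are not results from the paper, and the function names are the author of this sketch's own.

```python
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(objs):
    """Return fronts as lists of indices; front 0 is the Pareto-optimal set."""
    n = len(objs)
    dominated_by = [[] for _ in range(n)]  # solutions that i dominates
    dom_count = [0] * n                    # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(objs[i], objs[j]):
                dominated_by[i].append(j)
            elif dominates(objs[j], objs[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

def crowding_distance(objs, front):
    """Per-solution diversity score on one front; boundary points get inf."""
    dist = {i: 0.0 for i in front}
    for obj in range(len(objs[0])):
        order = sorted(front, key=lambda i: objs[i][obj])
        dist[order[0]] = dist[order[-1]] = float("inf")
        span = objs[order[-1]][obj] - objs[order[0]][obj] or 1.0
        for a, b, c in zip(order, order[1:], order[2:]):
            dist[b] += (objs[c][obj] - objs[a][obj]) / span
    return dist
```

In elitist selection, candidate RSAR configurations would be ranked first by front index and then, within the last admitted front, by descending crowding distance, so that the surviving population balances low validation error against a small train–validation gap while staying spread along the Pareto front.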

 

Keywords: Deep Neural Networks, LSTM, CNN, RSAR, Finance, Multi-Objective Optimization, NSGA-II