Using Neural Networks to Model Complex Mathematical Functions


Abdelfatah Kouidere
Mondher Damak

Abstract

Accurately modeling highly complex and irregular mathematical functions, such as those arising in fractals, chaos, and turbulence, poses longstanding challenges. Traditional physics-based approaches often fail due to analytic intractability and extreme sensitivity to initial conditions. In this work, we pioneer the use of long short-term memory (LSTM) recurrent neural networks for learning representations of such complex mathematical functions. We train custom-designed deep LSTM architectures on functions including the Lorenz attractor, the Mandelbrot set, and the Mackey-Glass delay differential equation. The networks achieve excellent quantitative performance on critical evaluation metrics such as mean squared error and R-squared. Qualitative visualizations also demonstrate highly precise function replication and generalization. Comparisons to polynomial regression and multilayer perceptron baselines confirm the superiority of the LSTM modeling approach. Our results provide broader evidence for the potential of deep neural networks to emulate intricate mathematical functions, with transformative implications for overcoming modeling limitations across the sciences and engineering. We conclude by reflecting on current methodological limitations and identifying key open questions to guide future work at the intersection of mathematical modeling and machine learning.
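As a concrete illustration of the kind of target function described above, the sketch below generates a Lorenz-attractor trajectory of the sort an LSTM would be trained to reproduce. This is a minimal stand-alone example, not the paper's actual pipeline: it assumes the standard chaotic parameters (sigma = 10, rho = 28, beta = 8/3) and a fourth-order Runge-Kutta integrator; the function names are illustrative.

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one step with classical 4th-order Runge-Kutta."""
    def f(s):
        x, y, z = s
        # The Lorenz equations: dx/dt, dy/dt, dz/dt.
        return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(
        s + dt / 6.0 * (a + 2 * b + 2 * c + d)
        for s, a, b, c, d in zip(state, k1, k2, k3, k4)
    )

def lorenz_trajectory(n_steps=2000, state=(1.0, 1.0, 1.0)):
    """Collect a trajectory of (x, y, z) points as supervised training data."""
    traj = [state]
    for _ in range(n_steps):
        state = lorenz_step(state)
        traj.append(state)
    return traj

traj = lorenz_trajectory()
```

A sequence model such as an LSTM would then be fit on sliding windows of `traj`, predicting the next state from a short history of preceding states.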


Article Details

How to Cite
Kouidere, A., & Damak, M. (2022). Using Neural Networks to Model Complex Mathematical Functions. Mesopotamian Journal of Big Data, 2022, 51–54. https://doi.org/10.58496/MJBD/2022/007