
Fractional Neural Networks Solve Complex Growth Equations

Mexican researchers pioneer AI approach to fractional differential equations using neural networks with Caputo derivatives

AI Research
March 26, 2026
4 min read

In the evolving landscape of artificial intelligence, neural networks have proven remarkably versatile at tackling problems ranging from image recognition to natural language processing. Now, researchers from Universidad Autónoma de Guerrero in Mexico are pushing these computational tools into new mathematical territory: solving fractional differential equations that model complex growth patterns. Their innovative approach combines fractional calculus with neural network architecture to address equations that have long resisted traditional analytical methods. This breakthrough could have significant implications for fields ranging from epidemiology to economics, where growth models with memory effects and non-local dynamics are increasingly important.

Fractional derivatives represent a sophisticated mathematical generalization of ordinary derivatives that captures non-local effects and long-term memory in dynamic systems. Unlike standard derivatives, which depend only on local behavior, fractional derivatives incorporate information from the entire history of a function, making them particularly valuable for modeling processes with anomalous dynamics. The Mexican research team focused specifically on the Caputo derivative, defined as an integro-differential operator that generalizes ordinary differentiation. Their methodology involved discretizing this Caputo derivative to make it computationally tractable within a neural network framework, creating what they term a Fractional Artificial Neural Network (FANN). This approach builds on the fundamental insight that neural networks can approximate continuous functions, extending this capability to fractional differential equations.
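For reference, the Caputo derivative of order 0 < α < 1 has the standard integro-differential definition below; the L1 scheme that follows is a widely used discretization on a uniform grid and is shown here as an illustration of the idea — the paper's exact discretization may differ in its details.

```latex
% Caputo derivative of order 0 < \alpha < 1:
{}^{C}\!D_t^{\alpha} f(t)
  \;=\; \frac{1}{\Gamma(1-\alpha)} \int_0^t \frac{f'(\tau)}{(t-\tau)^{\alpha}}\, d\tau

% Standard L1 discretization on a uniform grid t_k = k h:
{}^{C}\!D_t^{\alpha} f(t_n)
  \;\approx\; \frac{h^{-\alpha}}{\Gamma(2-\alpha)}
  \sum_{k=0}^{n-1} \bigl[(k+1)^{1-\alpha} - k^{1-\alpha}\bigr]
  \,\bigl(f(t_{n-k}) - f(t_{n-k-1})\bigr)
```

Note that for α = 1 the Caputo derivative reduces to the ordinary first derivative, which is why the researchers could validate their results against classical growth models at α = 1.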

The researchers implemented their fractional neural network in the statistical software R, using gradient-based techniques and the Adam optimization algorithm for training. Their architecture typically included multiple hidden layers with varying numbers of neurons — for instance, their logistic growth model used six hidden layers with 8, 42, 64, 64, 42, and 8 neurons respectively. They employed sigmoid activation functions and developed a specialized loss function that incorporated both the fractional differential equation and its initial conditions. The team tested their approach on three distinct growth models: a linear exponential growth equation, a nonlinear logistic growth model, and a logistic model with periodic harvesting. For each model, they compared their neural network approximations against analytical solutions where available, demonstrating remarkable accuracy across different fractional orders (α values ranging from 0.7 to 1).
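To make the ingredients concrete, here is a minimal NumPy sketch of the pieces described above: an L1 discretization of the Caputo derivative, a small sigmoid feedforward network, and a loss combining the equation residual with the initial condition, using the fractional exponential growth equation D^α y = r·y, y(0) = y₀ as the example. The paper's implementation is in R and is trained with Adam; the layer sizes, `r`, and `y0` below are illustrative assumptions, and the training loop is omitted.

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(0)

def caputo_l1(f_vals, h, alpha):
    """L1 approximation of the Caputo derivative of order 0 < alpha < 1.

    f_vals: function values on the uniform grid t_k = k*h, k = 0..N.
    Returns approximations of the derivative at t_1..t_N.
    """
    n = len(f_vals) - 1
    k = np.arange(n)
    b = (k + 1) ** (1 - alpha) - k ** (1 - alpha)   # quadrature weights b_k
    df = np.diff(f_vals)                             # f(t_{k+1}) - f(t_k)
    c = h ** (-alpha) / gamma(2 - alpha)
    # sum_{k=0}^{m-1} b_k * (f(t_{m-k}) - f(t_{m-k-1})) at each grid point t_m
    return np.array([c * np.dot(b[:m], df[m - 1::-1]) for m in range(1, n + 1)])

def init_net(sizes):
    """Random weights/biases for a feedforward net with the given layer sizes."""
    return [(rng.normal(0.0, 1.0, (m, n)) / np.sqrt(n), np.zeros(m))
            for n, m in zip(sizes[:-1], sizes[1:])]

def forward(params, t):
    """Sigmoid hidden layers, linear output; maps grid points t to y(t)."""
    x = t.reshape(1, -1)
    for W, bias in params[:-1]:
        x = 1.0 / (1.0 + np.exp(-(W @ x + bias[:, None])))  # sigmoid
    W, bias = params[-1]
    return (W @ x + bias[:, None]).ravel()

def loss(params, t, h, alpha, r, y0):
    """Residual of D^alpha y = r*y on the grid, plus the initial condition."""
    y = forward(params, t)
    residual = caputo_l1(y, h, alpha) - r * y[1:]
    return np.mean(residual ** 2) + (y[0] - y0) ** 2
```

A useful sanity check for the discretization: for f(t) = t the Caputo derivative is t^(1−α)/Γ(2−α), and the L1 scheme reproduces this exactly on linear functions because its weights telescope.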

The results section of their paper reveals compelling visual evidence of the FANN's effectiveness. Figure 1 shows the fractional exponential growth approximations closely matching the analytical solutions for various α values, while Figure 3 demonstrates similar success with the more complex logistic growth model. Perhaps most impressively, their method handled the challenging periodic harvesting model (shown in Figure 5), which adds a sinusoidal harvesting term to the logistic equation. The loss functions, displayed in Figures 2, 4, and 6, show consistent convergence during training across all models and α values. The researchers note that their neural network achieved 'satisfactory approximations' for all tested scenarios, validating the methodology's robustness even when analytical solutions were unavailable, as with fractional logistic equations where α < 1.

This research carries significant implications for scientific computing and mathematical modeling. By demonstrating that neural networks can effectively solve fractional differential equations, the team opens new possibilities for modeling complex systems with memory effects — from biological population dynamics to economic growth patterns with historical dependencies. The approach could prove particularly valuable in fields like epidemiology, where fractional models better capture the memory effects in disease spread, or in materials science for modeling viscoelastic materials. The researchers suggest their methodology could extend to other fractional models beyond growth equations, potentially transforming how scientists approach problems involving anomalous diffusion, fractional control systems, and other phenomena described by fractional calculus.

Despite these promising results, the authors acknowledge several limitations and directions for future research. Their current implementation focuses specifically on growth models posed as initial value problems, leaving boundary value problems and more complex fractional partial differential equations as potential extensions. The computational efficiency of their approach, while improved over some traditional methods, could benefit from further optimization, particularly for higher-dimensional problems. Additionally, the team notes that future work should explore neural network architectures beyond their current feedforward design, potentially incorporating recurrent or convolutional elements. They also suggest investigating alternative activation functions and optimization algorithms that might improve convergence rates or accuracy for specific classes of fractional differential equations.

Original Source

Read the complete research paper

View on arXiv

About the Author

Guilherme A.

Former dentist (MD) from Brazil, 41 years old, husband, and AI enthusiast. In 2020, he transitioned from a decade-long career in dentistry to pursue his passion for technology, entrepreneurship, and helping others grow.

Connect on LinkedIn