Transformer Semantic Genetic Programming for d-dimensional Symbolic Regression Problems

arXiv — cs.LG · Thursday, November 13, 2025
Transformer Semantic Genetic Programming (TSGP) marks a significant advance in symbolic regression. By leveraging a pre-trained transformer model, TSGP generates offspring programs that maintain controlled semantic similarity to their parent programs. Evaluated across 24 real-world and synthetic datasets, TSGP outperformed standard genetic programming and competing symbolic regression methods, including SLIM_GSGP, Deep Symbolic Regression, and Denoising Autoencoder GP, achieving an average rank of 1.58. Beyond accuracy, TSGP also produced more compact solutions, addressing a critical need for efficiency in program generation. Its target semantic distance mechanism ($\mathrm{SD}_t$) provides effective control over the balance between exploration and exploitation in the semantic space, further enhancing the method's adaptability and performance.
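To make the $\mathrm{SD}_t$ idea concrete, here is a minimal sketch, not the paper's implementation: in semantic GP, a program's "semantics" is its vector of outputs on the training inputs, and a target semantic distance can act as a filter that keeps only offspring whose semantics lie near a chosen distance from the parent. The function names and the tolerance parameter below are illustrative assumptions.

```python
import numpy as np

def semantic_distance(parent_sem, offspring_sem):
    # Semantics = vector of program outputs on the training inputs;
    # distance is the Euclidean norm between the two output vectors.
    return float(np.linalg.norm(np.asarray(parent_sem) - np.asarray(offspring_sem)))

def filter_by_target_distance(parent_sem, candidate_sems, sd_target, tol=0.1):
    # Illustrative selection rule (an assumption, not the paper's exact rule):
    # keep candidates whose distance to the parent is within a relative
    # tolerance of the target distance sd_target. A small sd_target biases
    # toward exploitation (offspring close to the parent); a large one
    # biases toward exploration.
    kept = []
    for sem in candidate_sems:
        d = semantic_distance(parent_sem, sem)
        if abs(d - sd_target) <= tol * sd_target:
            kept.append(sem)
    return kept

# Toy usage: parent semantics at the origin, two candidate offspring.
parent = np.zeros(4)
near = np.full(4, 0.5)   # distance 1.0 from parent
far = np.full(4, 2.5)    # distance 5.0 from parent
survivors = filter_by_target_distance(parent, [near, far], sd_target=1.0)
```

With `sd_target=1.0`, only the near candidate survives the filter, illustrating how the target distance steers offspring generation toward a chosen region of semantic space.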
— via World Pulse Now AI Editorial System
