Schauder Bases for $C[0, 1]$ Using ReLU, Softplus and Two Sigmoidal Functions

arXiv — cs.LG · Wednesday, December 10, 2025 at 5:00:00 AM
  • Researchers have constructed four Schauder bases for the function space $C[0,1]$, built from ReLU functions, Softplus functions, and two sigmoidal variants. According to the authors, this is the first time a basis for $C[0,1]$ has been established using these functions, strengthening their known universal approximation properties. An $O(\frac{1}{n})$ approximation bound is also demonstrated for the ReLU basis, alongside a negative result concerning the representation of multivariate functions by finite ReLU combinations; a sketch of the underlying ReLU-to-hat-function idea appears after this summary.
  • These Schauder bases are significant because they supply a concrete new tool for approximating functions in $C[0,1]$, linking classical functional analysis with the activation functions used in neural networks. The results may inform both the theory of function approximation and the design of neural network architectures.
  • The work fits ongoing efforts to analyze and improve activation functions in deep learning, such as the Variance-enhanced Learning Unit (VeLU), which addresses limitations of traditional functions like ReLU. Related studies on the complexity of one-dimensional ReLU deep neural networks examine their expressivity, reflecting growing interest in grounding neural network design in rigorous mathematical frameworks.
— via World Pulse Now AI Editorial System
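To illustrate why ReLU combinations can reproduce a classical basis of $C[0,1]$, the sketch below writes the standard Faber-Schauder hat functions as sums of three ReLUs and uses them to interpolate a Lipschitz function on $[0,1]$; the sup-norm error decays roughly like $O(\frac{1}{n})$, matching the flavor of the reported bound. The helper names (`relu`, `hat`, `relu_interpolant`) and the uniform-grid construction are illustrative assumptions, not the paper's actual basis.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x, a, m, b):
    # Tent function on [a, b], peaking at 1 at x = m, written as a
    # linear combination of three ReLUs (the classical Faber-Schauder block).
    return (relu(x - a) / (m - a)
            - relu(x - m) * (b - a) / ((m - a) * (b - m))
            + relu(x - b) / (b - m))

def relu_interpolant(f, n):
    # Piecewise-linear interpolant of f at n + 1 uniform nodes in [0, 1],
    # assembled entirely from ReLUs of affine functions.
    nodes = np.linspace(0.0, 1.0, n + 1)
    vals = f(nodes)

    def g(x):
        x = np.asarray(x, dtype=float)
        out = vals[0] * relu(1.0 - n * x)               # left boundary ramp
        out = out + vals[-1] * n * relu(x - nodes[-2])  # right boundary ramp
        for k in range(1, n):                           # interior hat functions
            out = out + vals[k] * hat(x, nodes[k - 1], nodes[k], nodes[k + 1])
        return out

    return g

if __name__ == "__main__":
    f = lambda x: np.abs(x - 1.0 / 3.0)   # Lipschitz test function with a kink off the grid
    xs = np.linspace(0.0, 1.0, 10_001)
    for n in (4, 8, 16, 32, 64):
        err = np.max(np.abs(f(xs) - relu_interpolant(f, n)(xs)))
        print(f"n = {n:3d}   sup error = {err:.4f}")    # roughly halves as n doubles
```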


Continue Reading
Complexity of One-Dimensional ReLU DNNs
Neutral · Artificial Intelligence
A recent study investigates the expressivity of one-dimensional ReLU deep neural networks (DNNs), revealing that the expected number of linear regions increases with the number of neurons in hidden layers. This research provides insights into the structure and capabilities of these networks, particularly in the infinite-width limit.
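The "number of linear regions" is the expressivity measure referred to here: a one-dimensional ReLU network computes a piecewise-linear function, and each maximal interval on which it is affine counts as one region. The sketch below estimates this count empirically for small random networks by tracking the ReLU activation pattern along a dense grid; the widths, Gaussian initialization, and the helper `count_linear_regions` are illustrative assumptions rather than the cited study's setup.

```python
import numpy as np

def count_linear_regions(widths, n_grid=200_000, seed=0):
    # Estimate the number of linear pieces of a random 1-D ReLU network on [-3, 3]
    # by tracking the ReLU activation pattern along a dense grid: every change of
    # pattern marks the boundary of a new linear region.
    rng = np.random.default_rng(seed)
    xs = np.linspace(-3.0, 3.0, n_grid)
    h = xs[None, :]                          # shape (1, n_grid): network inputs
    d_in = 1
    patterns = []
    for w in widths:
        W = rng.normal(size=(w, d_in)) / np.sqrt(d_in)
        b = rng.normal(size=(w, 1))
        z = W @ h + b
        patterns.append(z > 0)               # activation pattern of this layer
        h = np.maximum(z, 0.0)               # ReLU
        d_in = w
    pattern = np.concatenate(patterns, axis=0)                   # (total neurons, n_grid)
    changes = np.any(pattern[:, 1:] != pattern[:, :-1], axis=0)
    return int(changes.sum()) + 1            # regions = pattern changes + 1

if __name__ == "__main__":
    # Mean region counts grow with the number of hidden neurons, in line with the summary above.
    for widths in ([8], [8, 8], [16, 16]):
        counts = [count_linear_regions(widths, seed=s) for s in range(20)]
        print(widths, "mean linear pieces:", float(np.mean(counts)))
```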