On bounds for norms of reparameterized ReLU artificial neural network parameters: sums of fractional powers of the Lipschitz norm control the network parameter vector
A recent study published on arXiv establishes bounds for norms of reparameterized ReLU artificial neural network (ANN) parameters. It shows that the Lipschitz norm of the realization function of a feedforward fully-connected ReLU ANN can be bounded from above by sums of powers of the norm of the ANN parameter vector. For shallow ANNs, the study also proves a converse inequality: the parameter vector norm (up to reparameterization) is controlled by sums of fractional powers of the Lipschitz norm of the realization function. This converse holds specifically for the Lipschitz norm and fails when it is replaced by Hölder or Sobolev-Slobodeckij norms.
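The forward direction of the result can be illustrated numerically. The sketch below, a hypothetical example not taken from the paper, builds a random shallow ReLU network and checks two elementary facts: the empirical Lipschitz quotient never exceeds the parameter-based bound Σ|vᵢ|·‖Wᵢ‖, which in turn (by AM-GM) is at most a constant times the squared Euclidean norm of the parameters. The specific bound used here is a standard textbook estimate chosen for illustration, not the sharper exponents derived in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shallow ReLU network: f(x) = v @ relu(W x + b)
d, h = 3, 16
W = rng.normal(size=(h, d))
b = rng.normal(size=h)
v = rng.normal(size=h)

def f(x):
    return v @ np.maximum(W @ x + b, 0.0)

# Parameter-based Lipschitz bound: |f(x) - f(y)| <= (sum_i |v_i| * ||W_i||) * ||x - y||
lip_bound = np.abs(v) @ np.linalg.norm(W, axis=1)

# By AM-GM, sum_i |v_i| * ||W_i|| <= 0.5 * sum_i (v_i^2 + ||W_i||^2),
# i.e. the Lipschitz norm is controlled by a power of the parameter norm.
param_bound = 0.5 * (v @ v + np.sum(W * W))

# Empirical Lipschitz quotient over random point pairs
xs = rng.normal(size=(2000, d))
ys = rng.normal(size=(2000, d))
emp_lip = max(
    abs(f(x) - f(y)) / np.linalg.norm(x - y) for x, y in zip(xs, ys)
)

print(emp_lip <= lip_bound <= param_bound)
```

The chained comparison verifies the two inequalities at once; the empirical quotient is only a lower estimate of the true Lipschitz norm, but both bounds hold for it a fortiori.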
— via World Pulse Now AI Editorial System