Bulk-boundary decomposition of neural networks
A framework termed bulk-boundary decomposition has been introduced to clarify the training dynamics of deep neural networks. The approach reorganizes the Lagrangian of the learning process into two distinct components: a bulk term and a boundary term. The bulk term is data-independent and encodes the intrinsic architecture of the network, while the boundary term is data-dependent and captures the stochastic influence of the training data. By separating these contributions, the framework makes explicit how network structure and data-driven randomness each shape the learning dynamics. This decomposition fits into a broader line of recent theoretical analyses of neural networks. Overall, bulk-boundary decomposition offers a structured way to dissect the interplay between network design and data influence in deep learning models.
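The splitting described above can be written schematically as follows; the notation here is illustrative rather than taken from the original work, with $\theta$ denoting the network parameters and $\mathcal{D}$ the training data:

```latex
\mathcal{L}[\theta;\mathcal{D}]
  \;=\;
  \underbrace{\mathcal{L}_{\text{bulk}}[\theta]}_{\substack{\text{data-independent,}\\ \text{fixed by the architecture}}}
  \;+\;
  \underbrace{\mathcal{L}_{\text{boundary}}[\theta;\mathcal{D}]}_{\substack{\text{data-dependent,}\\ \text{stochastic during training}}}
```

Under this reading, all dependence on the sampled data is confined to the boundary term, so the bulk term can in principle be analyzed once per architecture, independently of any particular dataset.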
