SPEAR-MM: Selective Parameter Evaluation and Restoration via Model Merging for Efficient Financial LLM Adaptation
Positive · Artificial Intelligence
SPEAR-MM, a new framework for adapting large language models (LLMs) to financial domains, addresses catastrophic forgetting, in which models lose the general reasoning abilities crucial for customer interactions and complex financial analysis. By selectively evaluating which parameters to adapt and restoring the rest via model merging, SPEAR-MM retains 91.2% of general capabilities, compared with 69.7% under standard continual pretraining, while preserving 94% of the gains from domain adaptation and cutting computational costs by 90%. This efficiency is particularly valuable for resource-constrained financial institutions, allowing them to adopt advanced AI without prohibitive costs. The framework's interpretable trade-off control further broadens its applicability, letting institutions balance general and domain-specific capabilities.
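The article does not describe the implementation, but the core idea of restoring selected parameters toward the base model via merging can be sketched roughly as below. This is a minimal illustration assuming PyTorch-style state dicts; the per-parameter scores, threshold, and interpolation coefficient alpha are hypothetical stand-ins, not the actual SPEAR-MM procedure.

```python
import torch

def selectively_restore(base_state, adapted_state, layer_scores, threshold=0.5, alpha=0.5):
    """Blend selected layers of a domain-adapted model back toward the base model.

    base_state / adapted_state: parameter-name -> tensor dicts (e.g. model.state_dict()).
    layer_scores: parameter-name -> float, a hypothetical score for how much restoring
        this parameter would help recover general capability.
    threshold: parameters scoring above this are (partially) restored.
    alpha: interpolation weight toward the base model for restored parameters
        (1.0 = full restoration, 0.0 = keep the adapted weights unchanged).
    """
    merged = {}
    for name, adapted_w in adapted_state.items():
        score = layer_scores.get(name, 0.0)
        if score > threshold and name in base_state:
            # Linear interpolation between adapted and base weights for this parameter.
            merged[name] = (1 - alpha) * adapted_w + alpha * base_state[name]
        else:
            # Keep the domain-adapted weights for parameters deemed safe to specialize.
            merged[name] = adapted_w.clone()
    return merged

# Toy usage with two fake "layers"; in practice these would come from model.state_dict().
base = {"layer1.weight": torch.zeros(4, 4), "layer2.weight": torch.zeros(4, 4)}
adapted = {"layer1.weight": torch.ones(4, 4), "layer2.weight": torch.ones(4, 4)}
scores = {"layer1.weight": 0.9, "layer2.weight": 0.1}  # layer1 judged critical to general ability

merged = selectively_restore(base, adapted, scores, threshold=0.5, alpha=0.5)
print(merged["layer1.weight"][0, 0].item())  # 0.5 -> blended back toward the base model
print(merged["layer2.weight"][0, 0].item())  # 1.0 -> domain-adapted weights kept
```

In a sketch like this, alpha would play the role of the interpretable trade-off knob the article mentions: larger values favor retained general capability, smaller values preserve more of the domain adaptation.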
— via World Pulse Now AI Editorial System