From Scratch to Fine-Tuned: A Comparative Study of Transformer Training Strategies for Legal Machine Translation
Positive | Artificial Intelligence
- A recent study has demonstrated the effectiveness of Transformer-based approaches to Legal Machine Translation (L-MT) for English-Hindi, addressing language barriers in India's legal system. The research, conducted as part of the JUST-NLP 2025 shared task, compared a model fine-tuned from the pre-trained OPUS-MT checkpoint against a Transformer trained from scratch; the fine-tuned model achieved a SacreBLEU score of 46.03, significantly outperforming the baseline.
- This development matters because it broadens access to legal information for non-English speakers in India, making judicial documentation more accessible and understandable. Successful deployment of L-MT could extend to other legal contexts, improving communication and comprehension in legal proceedings.
- The study highlights ongoing efforts to leverage AI in the legal domain, paralleling other initiatives in India aimed at enhancing legal AI capabilities, such as judgment prediction and document summarization. These advancements reflect a growing recognition of the importance of AI in bridging language gaps and improving the efficiency of legal processes in multilingual societies.
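The summary above reports results in SacreBLEU, a standardized corpus-level BLEU implementation. As a rough illustration of what such a score measures, here is a minimal, stdlib-only sketch of corpus BLEU (modified n-gram precision up to 4-grams plus a brevity penalty). This is not the actual SacreBLEU implementation, which additionally fixes tokenization, offers smoothing variants, and reports a reproducibility signature; the function and variable names here are illustrative.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # Multiset of n-grams in a token sequence.
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Simplified corpus-level BLEU: clipped n-gram precision (n=1..max_n),
    geometric mean, and brevity penalty. Whitespace tokenization only;
    real SacreBLEU standardizes tokenization and adds smoothing options."""
    clipped = [0] * max_n   # matched n-gram counts, clipped by reference
    totals = [0] * max_n    # total hypothesis n-gram counts
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            h_ng, r_ng = ngrams(h, n), ngrams(r, n)
            totals[n - 1] += sum(h_ng.values())
            clipped[n - 1] += sum(min(c, r_ng[g]) for g, c in h_ng.items())
    if min(clipped) == 0:
        return 0.0  # no smoothing: any empty precision zeroes the score
    log_prec = sum(math.log(c / t) for c, t in zip(clipped, totals)) / max_n
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / max(hyp_len, 1))
    return 100 * bp * math.exp(log_prec)
```

A perfect hypothesis scores 100; partial n-gram overlap yields an intermediate score, which is the scale on which the reported 46.03 sits.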
— via World Pulse Now AI Editorial System


