An ensemble novel architecture for Bangla Mathematical Entity Recognition (MER) using transformer-based learning
Summary
This summary is machine-generated. This study introduces Bangla Mathematical Entity Recognition (MER) using Bidirectional Encoder Representations from Transformers (BERT). The proposed approach achieves high accuracy in identifying mathematical operators, operands, and terms in Bangla text.
Area Of Science
- Natural Language Processing
- Computational Linguistics
- Artificial Intelligence
Background
- Mathematical entity recognition is crucial for machine understanding and processing of mathematical content.
- Applications include automated theorem proving, knowledge retrieval, and educational platforms.
- Mathematical entity recognition in the Bangla language remains largely unexplored.
Purpose Of The Study
- To develop and evaluate a system for Mathematical Entity Recognition (MER) in the Bangla language.
- To identify mathematical operators, operands (numbers), and common mathematical terms.
- To leverage deep learning, specifically Bidirectional Encoder Representations from Transformers (BERT), for this task.
Main Methods
- Utilized an ensemble architecture of deep neural networks based on BERT.
- Created a novel dataset of 13,717 Bangla mathematical statements with annotated entities and types.
- Employed accuracy, precision, recall, and F1-score as performance metrics.
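The summary does not specify how the ensemble combines its BERT members, but a common scheme for token-level tasks like MER is per-token majority voting over the models' predicted tags. The sketch below illustrates that idea; the tag names (`OPR` for operator, `OPD` for operand, `TRM` for mathematical term, `O` for other) and the tie-breaking rule are illustrative assumptions, not the paper's actual label set.

```python
from collections import Counter

# Hypothetical tag set for Bangla MER: operator, operand, math term, other.
TAGS = ["OPR", "OPD", "TRM", "O"]

def ensemble_vote(per_model_tags):
    """Majority vote over per-token tag predictions from several models.

    per_model_tags: one tag sequence per model, all of equal length.
    Ties are broken in favor of the first model's prediction.
    """
    n_tokens = len(per_model_tags[0])
    voted = []
    for i in range(n_tokens):
        votes = [tags[i] for tags in per_model_tags]
        counts = Counter(votes)
        top = counts.most_common(1)[0][1]
        # Among tied leaders, keep the tag predicted by the earliest model.
        voted.append(next(t for t in votes if counts[t] == top))
    return voted

# Three models tagging a statement like "৫ যোগ ৩" (five plus three):
preds = [
    ["OPD", "OPR", "OPD"],
    ["OPD", "OPR", "OPD"],
    ["OPD", "TRM", "OPD"],  # one model disagrees on the middle token
]
print(ensemble_vote(preds))  # ['OPD', 'OPR', 'OPD']
```

Voting over independently fine-tuned models is one plausible way an ensemble could lift accuracy from 97.98% to 99.76%: a token is misclassified only when a majority of members err on it simultaneously.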
Main Results
- Achieved a satisfactory accuracy of 97.98% using a single BERT model.
- The ensemble BERT architecture demonstrated superior performance with an accuracy of 99.76%.
- The system effectively recognized operators, operands, and mathematical terms.
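For reference, the reported metrics can be computed per entity class from aligned gold and predicted tag sequences. A minimal sketch, using the same hypothetical tag names as above (the paper's exact evaluation protocol, e.g. token-level vs. entity-level scoring, is not stated in the summary):

```python
def token_metrics(gold, pred, positive):
    """Token-level precision, recall, and F1 for one tag class."""
    tp = sum(1 for g, p in zip(gold, pred) if g == positive and p == positive)
    fp = sum(1 for g, p in zip(gold, pred) if g != positive and p == positive)
    fn = sum(1 for g, p in zip(gold, pred) if g == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

gold = ["OPD", "OPR", "OPD", "O",   "TRM"]
pred = ["OPD", "OPR", "OPD", "TRM", "TRM"]
p, r, f = token_metrics(gold, pred, "TRM")
# For TRM: tp=1, fp=1, fn=0, so precision=0.5, recall=1.0
```

Macro-averaging these per-class scores over operators, operands, and terms yields the overall precision, recall, and F1 figures the study reports alongside accuracy.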
Conclusions
- The proposed ensemble BERT model is highly effective for Bangla Mathematical Entity Recognition.
- This work establishes a baseline for MER in the Bangla language.
- The developed dataset and methodology can facilitate future research in multilingual mathematical NLP.

