Top Accurate Models for Handling Complex Arabic Linguistic Structures

Authors

  • Murteza Hanoon Tuama Department of Computer Techniques Engineering, Imam Al-Kadhum University College, Baghdad, Iraq
  • Wahhab Muslim mashloosh Department of Computer Techniques Engineering, Imam Al-Kadhum University College, Baghdad, Iraq
  • Yasir Mahmood Younus Department of Computer Techniques Engineering, Imam Al-Kadhum University College, Baghdad, Iraq

Keywords:

Natural Language Processing, Arabic language, Sentiment Analysis

Abstract

Arabic is a morphologically rich language but is resource-poor in corpora and syntactic tools compared to English, which complicates Arabic Natural Language Processing (NLP) applications such as Question Answering (QA), Named Entity Recognition (NER), and Sentiment Analysis (SA). Nevertheless, recent developments in transformer-based models have shown that language-specific BERT models, pre-trained on large corpora, achieve better overall performance in Arabic language understanding. They represent the new state of the art, providing excellent results on diverse NLP tasks. In this work we introduce AraBERT, a BERT model constructed specifically for Arabic, with the aim of bringing BERT's success to the Arabic language as has been achieved for English. We assess AraBERT against Google's multilingual BERT and other cutting-edge techniques. The study found that the newly built AraBERT surpasses the state of the art on most Arabic NLP tasks.

Published

2024-09-28

How to Cite

Tuama, M. H., mashloosh, W. M., & Younus, Y. M. (2024). Top Accurate Models for Handling Complex Arabic Linguistic Structures. American Journal of Engineering, Mechanics and Architecture (2993-2637), 2(9), 113–122. Retrieved from https://grnjournal.us/index.php/AJEMA/article/view/5806