- Godson Chetachi Uzoaru, Ignatius Ikechukwu Ayogu, Aloysius C. Onyeka, Juliet Odii
- DOI: 10.5281/zenodo.15857587
Building chatbots for low-resource languages is challenging because labelled datasets are scarce, linguistic variation is high, and computing resources are limited. Attention mechanisms and contextual modelling help chatbots interpret language more accurately, maintain coherence, and respond appropriately across long conversations. This paper reviews recent chatbot models that use attention-based methods, discussing architectures including Transformer-based models, memory-augmented networks, and hybrid approaches. The review also highlights major challenges such as complex morphology and syntax, biased responses, and ethical concerns. To address these issues, the paper examines solutions such as few-shot learning, transferring knowledge from existing models, and community-driven data collection. By surveying a range of chatbot applications, the review identifies key trends and best practices for making chatbots more inclusive, effective, and fair. These findings can guide researchers and developers in improving chatbot technology, particularly for low-resource languages.
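For reference, the attention mechanism these models build on is most commonly the scaled dot-product attention of the Transformer. The following is a standard sketch of that formulation (from Vaswani et al., 2017, not a formula specific to this review), where $Q$, $K$, and $V$ denote the query, key, and value matrices and $d_k$ is the key dimension:

$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V$$

Intuitively, each query is compared against all keys, the resulting weights are normalised by softmax, and the output is a weighted sum of the values, which is what lets a chatbot condition each response token on the full conversational context.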