In recent years, artificial intelligence (AI) has transformed personalized health and nutrition systems through machine learning and natural language processing (NLP). This research introduces an AI-powered, weight-based meal recommendation and question-answering (QnA) chatbot system built on pre-trained transformer models (T5-base and T5-large). The system provides personalized dietary recommendations based on gender and BMI-derived weight category (Underweight, Normal weight, Overweight, Obesity) and answers common health-related queries interactively. The chatbot engine is trained on a custom dataset of 30 question-answer pairs for each of the 8 user categories (gender crossed with weight classification), stored in and retrieved dynamically from MongoDB. Both T5-base and T5-large are fine-tuned for the sequence-to-sequence QnA task via transfer learning. Despite its smaller size, T5-base runs on low-resource machines with as little as 8 GB of RAM and an integrated GPU, offering scalability and accessibility for broader deployment. The system is served through a Flask backend integrated with a responsive front-end built with HTML, CSS, and JavaScript, enabling real-time user interaction. The final solution demonstrates the viability of transformer-based language models for healthcare QnA and recommendation with minimal computational overhead, and provides a blueprint for future developments in AI-powered nutrition systems and intelligent health assistants.
1. Introduction
As lifestyle diseases such as obesity and diabetes rise, there is a growing need for personalized, accessible dietary advice. Many people struggle to choose meals appropriate to their body type and lifestyle. This research introduces an AI-powered chatbot system that:
Provides weight-based meal recommendations, and
Offers an interactive health QnA interface
— using T5 transformer models and MongoDB.
2. Key Features of the System
User Categorization: 8 categories formed by crossing gender (Male, Female) with BMI class (Underweight, Normal weight, Overweight, Obesity).
Model inference is handled using a test.py script.
Queries are mapped to the appropriate BMI+gender category.
Decoding uses beam search to improve response quality.
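The mapping and decoding steps above can be sketched as follows. The prompt template, category-tag format, and beam width are illustrative assumptions, not taken verbatim from `test.py`; `decode_answer` expects a Hugging Face T5 model and tokenizer of the kind the system fine-tunes:

```python
def category_tag(gender: str, weight_class: str) -> str:
    """Build one of the 8 category tags, e.g. 'Female_Underweight'."""
    return f"{gender}_{weight_class.replace(' ', '_')}"

def build_prompt(question: str, gender: str, weight_class: str) -> str:
    # Prefixing the category steers the fine-tuned model toward the
    # right answer set; the exact template is an assumption.
    return f"answer ({category_tag(gender, weight_class)}): {question}"

def decode_answer(model, tokenizer, prompt: str) -> str:
    # Beam-search decoding, as described in the paper; `model` and
    # `tokenizer` are transformers T5ForConditionalGeneration /
    # T5Tokenizer instances loaded elsewhere.
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs, num_beams=4, max_new_tokens=64, early_stopping=True
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```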
D. Deployment Workflow
User logs in → BMI calculated → Query passed to correct T5 model → Response generated and shown in frontend.
System runs smoothly on 8GB RAM and integrated GPU machines.
5. System Architecture Flow
Summarized steps:
1. User logs in and inputs data (gender, height, weight).
2. BMI is calculated and used to assign a user category.
3. User asks a question.
4. Flask backend routes the query to the appropriate T5 model.
5. Model generates a response.
6. MongoDB provides historical or template responses if needed.
7. Final answer is returned via the web UI.
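Step 2 above (BMI calculation and category assignment) can be sketched directly; the thresholds are the standard WHO cut-offs matching the four classes named in the paper:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def weight_category(bmi_value: float) -> str:
    # Standard WHO cut-offs for the four classes used by the system
    if bmi_value < 18.5:
        return "Underweight"
    if bmi_value < 25.0:
        return "Normal weight"
    if bmi_value < 30.0:
        return "Overweight"
    return "Obesity"

def user_category(gender: str, weight_kg: float, height_m: float) -> str:
    """One of the 8 routing keys, e.g. 'Male_Overweight'."""
    label = weight_category(bmi(weight_kg, height_m))
    return f"{gender}_{label.replace(' ', '_')}"
```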
6. Results and Evaluation
A. Evaluation Metrics
Accuracy (%): Semantic correctness of answers
Repetition Rate: Word redundancy in output
Resource Usage: Memory and inference time
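The paper does not give an exact formula for the repetition-rate metric; one plausible definition, shown here as an assumption, is the fraction of word tokens that repeat an earlier token in the output:

```python
def repetition_rate(text: str) -> float:
    """Fraction of word tokens that are repeats of an earlier token.
    This is one plausible reading of the metric, not the paper's
    definition."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 1.0 - len(set(words)) / len(words)
```

A fully non-redundant output scores 0.0; an output that repeats every word once scores 0.5.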
B. Model Comparison
Model
Parameters
Accuracy
Notes
T5-base
220M
90%
Lightweight, slight phrase repetition
T5-large
770M
93%
Better fluency, higher resource usage
C. Sample Output Example
Input: "What should I eat in the morning?" – Male_Overweight
T5-base: "You can eat oats with fruits and green tea for energy."
T5-large: "A bowl of oatmeal topped with berries and a boiled egg is ideal."
D. Inference Speed
T5-base: ~220ms/input
T5-large: ~420ms/input
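Per-input latency figures like these can be measured with a simple wall-clock harness; the warm-up pass and the `generate` callable signature are assumptions for illustration (in practice `generate` would wrap the model's inference call):

```python
import time

def mean_latency_ms(generate, prompts, warmup: int = 1) -> float:
    """Average wall-clock time per input, in milliseconds.
    `generate` is any callable taking one prompt string."""
    for p in prompts[:warmup]:
        generate(p)  # warm-up run, excluded from timing
    start = time.perf_counter()
    for p in prompts:
        generate(p)
    return (time.perf_counter() - start) * 1000.0 / len(prompts)
```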
E. Final Model Chosen
T5-base selected for deployment due to:
Lower resource demand
Sufficient accuracy (90%)
Real-time compatibility
7. Key Takeaways
T5-based NLP models are highly effective for personalized QnA and recommendation systems.
Fine-tuning even on small domain-specific datasets (30 QA pairs per category) yields robust results.
The system is lightweight, scalable, and feasible for real-world deployment, including on modest hardware.
The work lays a strong foundation for future AI health assistants in nutrition and fitness.
Conclusion
In this research, we successfully developed an AI-Powered Weight-Based Meal Recommendation and QnA Chatbot using pre-trained T5 language models—specifically T5-base and T5-large. By leveraging the question-answering capabilities of transformer-based models, our system accurately responds to user queries based on their gender and BMI classification (Underweight, Normal Weight, Overweight, and Obesity for both Male and Female categories).
The system was trained on a custom dataset extracted from MongoDB containing approximately 30 QA pairs per category, making it highly focused and domain-specific. The results highlight that:
• T5-base provides a fast and lightweight solution with 90% accuracy, making it ideal for deployment on standard hardware (e.g., 8GB RAM and integrated GPU).
• T5-large, although more resource-intensive, achieved higher performance with 93% accuracy, thanks to its 770 million parameters, offering better contextual understanding and less semantic ambiguity.