RAG-powered chatbot to streamline technical support and knowledge retrieval.
Designed and built the full Retrieval-Augmented Generation (RAG) pipeline to generate context-aware responses grounded in internal data.
Integrated a vector database to enable fast and relevant semantic search across historical tickets and metadata.
Processed and structured raw data, retaining only the necessary information to keep the knowledge base relevant and usable.
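The retrieval step above can be sketched in miniature. The real system uses a proper embedding model and vector database, so `embed` below is a hashed bag-of-words stand-in and the ticket texts are hypothetical; only the index-then-rank-by-cosine shape matches the pipeline described.

```python
import math
import re
from collections import Counter

def embed(text: str, dim: int = 64) -> list[float]:
    # Stand-in embedding: hashed bag-of-words, L2-normalised. The production
    # pipeline would call a real embedding model here instead.
    vec = [0.0] * dim
    for token, count in Counter(re.findall(r"\w+", text.lower())).items():
        vec[hash(token) % dim] += count
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

# Index historical tickets with their embeddings, as a vector DB would.
tickets = [
    "VPN connection drops after laptop wakes from sleep",
    "Printer queue stuck, jobs not printing",
    "Password reset email never arrives",
]
index = [(text, embed(text)) for text in tickets]

def search(query: str, top_k: int = 1) -> list[str]:
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]
```

A query phrased differently from the stored ticket still lands on the right one, which is the point of semantic search over rigid keyword filters.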
Users can ask troubleshooting questions in plain language without relying on rigid keyword filters.
Implements intent-based routing to determine whether to retrieve past ticket resolutions or application-related information.
Each response includes references to original data sources to reduce hallucinations and improve trust.
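The intent routing can be sketched as a small classifier. In practice this might be an LLM call or a trained model; the cue words and route names below are illustrative assumptions, not the production logic.

```python
# Hypothetical cue words suggesting the user wants a past ticket resolution.
TICKET_CUES = {"error", "fail", "failed", "crash", "broken", "bug", "stuck"}

def route(query: str) -> str:
    # Illustrative heuristic: the real router could be an LLM classifier.
    # Route names ("ticket_resolutions", "app_docs") are hypothetical.
    tokens = set(query.lower().split())
    if tokens & TICKET_CUES:
        return "ticket_resolutions"
    return "app_docs"
```

Each route then retrieves from its own collection, so troubleshooting questions hit past tickets while how-to questions hit application documentation.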
LLMs struggled to understand raw tabular data from Excel.
Built a custom parser to transform tables into a structured narrative format based on how domain experts interpret the data.
Improved data usability and made tabular information retrievable within the RAG pipeline.
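A minimal sketch of such a parser, assuming a CSV export with hypothetical column names (`id`, `component`, `resolution`); the sentence template stands in for the expert-derived phrasing the real parser encodes.

```python
import csv
import io

def rows_to_narrative(csv_text: str) -> list[str]:
    # One sentence per row, so each fact becomes a self-contained,
    # indexable chunk for the RAG pipeline.
    sentences = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        sentences.append(
            f"Ticket {row['id']} about {row['component']} "
            f"was resolved by {row['resolution']}."
        )
    return sentences

# Hypothetical Excel export, already saved as CSV.
sample = """id,component,resolution
1042,VPN client,reinstalling the network driver
1043,Email gateway,whitelisting the sender domain"""
```

Embedding these sentences instead of raw cells gives the retriever text that actually resembles the questions users ask.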
Retrieving relevant results for technical queries and past ticket resolutions was inconsistent.
Implemented hybrid search combining embedding-based and keyword-based retrieval, and benchmarked multiple embedding models.
Achieved more accurate and stable retrieval for both keyword and natural-language queries.
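One common way to merge the two retrievers' results is reciprocal-rank fusion; the source does not name the fusion method, so this is an assumed illustration of the hybrid step.

```python
def rrf(rankings: list[list[str]], k: int = 60) -> list[str]:
    # Reciprocal-rank fusion: documents ranked highly by either retriever
    # (keyword or embedding) float to the top; k=60 is a conventional default.
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical ticket IDs as ranked by each retriever.
keyword_hits = ["T-1042", "T-1043", "T-1050"]
embedding_hits = ["T-1043", "T-1050", "T-1042"]
```

Because fusion works on ranks rather than raw scores, the keyword and embedding retrievers need no score calibration against each other.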
Needed a fast and reliable LLM that could follow strict guidelines and avoid hallucinations.
Evaluated several models and designed system prompts with guardrails to ensure responses stayed grounded in retrieved data.
Improved response reliability, reduced hallucinations, and maintained a consistent tone.
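The grounding guardrails can be sketched as a prompt builder in the common chat-messages shape; the actual system prompt wording and model are not shown in the source, so everything below is illustrative.

```python
SYSTEM_PROMPT = (
    "You are an internal support assistant. Answer ONLY from the provided "
    "context. If the context does not contain the answer, say you don't "
    "have enough information instead of guessing. Cite the source ticket "
    "ID for every claim."
)

def build_messages(context_chunks: list[str], question: str) -> list[dict]:
    # Retrieved chunks are inlined so the model answers only from them,
    # which is what keeps responses grounded and citable.
    context = "\n\n".join(context_chunks)
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ]
```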
Started as a research experiment; approved for full development after being presented at an internal team meeting.
Developed into a functional application and deployed to the internal staging environment.
The application is now used by the Support team to find relevant ticket resolutions more efficiently, reducing manual search effort and helping resolve issues faster.