
Question Answering Bot using Langchain and OpenAI

Introduction

This project implements a custom question-answering chatbot powered by LangChain and exposes it through a FastAPI API.

Tools Used

  • LangChain: a framework for building applications powered by large language models, including conversational AI systems.
  • FastAPI: a modern, high-performance web framework for building APIs with Python.

Getting Started

To run the project locally, follow these steps:

  1. Clone the repository to your local machine.
  2. Install the necessary dependencies listed in the requirements.txt file.
  3. Run the application by executing uvicorn app.main:app --port {} --reload in your terminal, substituting the desired port number for {}.
  4. The /qa endpoint expects two files as input:
    1. a JSON file containing the list of questions to be answered by the LLM, e.g. {"questions": ["question1", "question2" ...]}
    2. a JSON or PDF file containing the content over which the questions are answered. A JSON file must have the form {"content": "long piece of text ..."}
  5. Use the sample input files located in the tests folder to test the API.
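
The two input files described above can be prepared with a few lines of Python. The file names and contents here are just examples, not the actual samples from the tests folder:

```python
import json

# Example inputs for the /qa endpoint (contents are illustrative).
questions = {"questions": ["What is this project about?",
                           "Which framework serves the API?"]}
content = {"content": "This project is a question-answering bot "
                      "built with LangChain and FastAPI."}

# Write each payload to its own JSON file.
with open("questions.json", "w") as f:
    json.dump(questions, f)
with open("content.json", "w") as f:
    json.dump(content, f)
```

The resulting files can then be uploaded to the /qa endpoint with curl or any HTTP client.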
