This project implements a custom question-answering chatbot powered by LangChain and exposes it through a FastAPI API.
- LangChain: a framework for building conversational AI systems.
- FastAPI: a modern, high-performance web framework for building APIs with Python.
To run the project locally, follow these steps:
- Clone the repository to your local machine.
- Install the necessary dependencies listed in the `requirements.txt` file.
- Run the application by executing `uvicorn app.main:app --port {} --reload` in your terminal.
- The `/qa` endpoint needs to be passed two files as input:
  - a JSON file containing a list of questions that need to be answered using the LLM, e.g. `{"questions": ["question1", "question2", ...]}`
  - a JSON or PDF file containing the content over which the questions need to be answered. A JSON file must have content like this: `{"content": "long piece of text ..."}`
- Use the sample input files located in the `tests` folder to test the API.
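The two JSON input formats above can be validated with the standard library before sending them to the API. This is a small illustrative sketch: the sample payloads and helper names are made up here, not part of the project.

```python
import json

# Hypothetical sample payloads matching the two formats described above.
questions_json = '{"questions": ["What is LangChain?", "What does the API expose?"]}'
content_json = '{"content": "LangChain is a framework for building LLM applications."}'


def validate_questions(raw: str) -> list[str]:
    """Check the questions file: a JSON object with a 'questions' list of strings."""
    data = json.loads(raw)
    questions = data.get("questions")
    if not isinstance(questions, list) or not all(isinstance(q, str) for q in questions):
        raise ValueError('expected {"questions": ["...", ...]}')
    return questions


def validate_content(raw: str) -> str:
    """Check the content file: a JSON object with a single 'content' string."""
    data = json.loads(raw)
    content = data.get("content")
    if not isinstance(content, str):
        raise ValueError('expected {"content": "..."}')
    return content


questions = validate_questions(questions_json)
content = validate_content(content_json)
```

Running such a check locally surfaces malformed input files before they reach the `/qa` endpoint.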