Use the new GPT-4 API to build a ChatGPT-style chatbot for large PDF docs (a 56-page PDF is used in this example).

The tech stack includes LangChain, Pinecone, TypeScript, OpenAI, and Next.js. LangChain is a framework that makes it easier to build scalable AI/LLM apps and chatbots. Pinecone is a vector store that holds the embeddings of your PDF's text so similar docs can be retrieved later.
Get in touch via Twitter if you have questions.
The visual guide for this repo and tutorial is in the `visual guide` folder.
- Clone the repo

  ```
  git clone [github https url]
  ```
- Install packages

  ```
  pnpm install
  ```
- Set up your `.env` file
  - Copy `.env.example` into `.env`. Your `.env` file should look like this:

    ```
    OPENAI_API_KEY=
    PINECONE_API_KEY=
    PINECONE_ENVIRONMENT=
    ```
- Visit OpenAI to retrieve your API keys and insert them into your `.env` file.
- Visit Pinecone to create and retrieve your API keys.
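All three variables need to be set before the app can talk to OpenAI and Pinecone. As a quick sanity check (a hypothetical helper, not part of this repo), you could verify them like this:

```typescript
// Hypothetical helper (not part of the repo): report which required
// environment variables are still unset before starting the app.
function missingEnvVars(required: string[]): string[] {
  return required.filter((name) => !process.env[name]);
}

const missing = missingEnvVars([
  'OPENAI_API_KEY',
  'PINECONE_API_KEY',
  'PINECONE_ENVIRONMENT',
]);
if (missing.length > 0) {
  console.error(`Missing env vars: ${missing.join(', ')}`);
}
```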
- In the `config` folder, replace the `PINECONE_INDEX_NAME` and `PINECONE_NAME_SPACE` with your own details from your Pinecone dashboard.
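The config entry might look roughly like the following sketch (the exact file name and shape are assumptions; match whatever the repo's config file actually exports):

```typescript
// Sketch of a Pinecone config file (file layout assumed, not verbatim
// from the repo). Both values come from your Pinecone dashboard.
const PINECONE_INDEX_NAME = 'your-index-name'; // the index you created
const PINECONE_NAME_SPACE = 'your-namespace';  // namespace for this PDF's vectors

export { PINECONE_INDEX_NAME, PINECONE_NAME_SPACE };
```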
- In the `utils/makechain.ts` chain, change the `QA_PROMPT` for your own use case. Change `modelName` in `new OpenAIChat` to a different API model if you don't have access to `gpt-4`. See the OpenAI docs for a list of supported `modelName`s. For example, you could use `gpt-3.5-turbo` if you do not have access to `gpt-4` yet.
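As a sketch, the relevant fragment of `utils/makechain.ts` would change roughly like this (the options besides `modelName` are illustrative assumptions; keep whatever other settings the file already passes to `new OpenAIChat`):

```typescript
// Illustrative fragment only (other options are assumptions, not the
// repo's exact settings). Swap the model if your key lacks gpt-4 access.
const model = new OpenAIChat({
  temperature: 0,
  modelName: 'gpt-3.5-turbo', // use 'gpt-4' if your account has access
});
```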
- In the `docs` folder, replace the PDF with your own PDF doc.
- In `scripts/ingest-data.ts`, replace `filePath` with `docs/{yourdocname}.pdf`.
- Run the script `npm run ingest` to 'ingest' and embed your docs.
- Check the Pinecone dashboard to verify that your namespace and vectors have been added.
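Under the hood, ingestion splits the PDF's text into overlapping chunks before embedding each one into Pinecone. Here is a dependency-free sketch of that chunking idea (the real script presumably uses a LangChain text splitter; the sizes here are illustrative):

```typescript
// Minimal sketch of overlapping chunking for embedding (illustrative,
// not the repo's actual splitter). Requires chunkSize > overlap,
// otherwise the loop would never advance.
function splitIntoChunks(text: string, chunkSize: number, overlap: number): string[] {
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
  }
  return chunks;
}

// Each chunk repeats the tail of the previous one, so a sentence cut at
// a chunk boundary still appears whole in at least one chunk.
const example = splitIntoChunks('abcdefghij', 4, 1);
```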
Once you've verified that the embeddings and content have been successfully added to your Pinecone index, you can run the app with `npm run dev` to launch the local dev environment, then type a question in the chat interface.
The frontend of this repo is inspired by langchain-chat-nextjs.