Use any LlamaIndex vector store as a Query Engine
This notebook demonstrates the use of the LlamaIndexQueryEngine for
retrieval-augmented question answering over documents. It shows how to
set up the engine with Docling-parsed Markdown files and execute
natural language queries against the indexed data.

The LlamaIndexQueryEngine
provides an efficient way to query vector databases
using any of LlamaIndex's vector
stores.
We use some Markdown (.md) files as input; feel free to try your own
text or Markdown documents.
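For instance, a couple of throwaway Markdown files can stand in for Docling-parsed output; the file names and contents below are just placeholders:

```python
from pathlib import Path
import tempfile

# Create a temporary directory holding two small Markdown files to index.
input_dir = Path(tempfile.mkdtemp())
(input_dir / "report.md").write_text(
    "# Example report\n\nRevenue grew year over year in fiscal 2024.\n"
)
(input_dir / "notes.md").write_text("# Notes\n\nSome free-form notes to query.\n")

# Collect the paths to hand to the query engine later.
input_docs = sorted(str(p) for p in input_dir.glob("*.md"))
print(input_docs)
```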
You can create and add this LlamaIndexQueryEngine to
DocAgent
to use.
This example requires the OPENAI_API_KEY
to be set in your
environment variables. See our
documentation
for guidance.
We then use the chroma_vector_store
to create our AG2
LlamaIndexQueryEngine
instance.
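A sketch of that setup might look like the following. The Chroma pieces follow LlamaIndex's Chroma integration; the AG2 import path, constructor arguments, and document path are assumptions to verify against your installed version, and the live query only runs when an OPENAI_API_KEY is present:

```python
import os

COLLECTION_NAME = "docling-md-demo"  # arbitrary example collection name

try:
    import chromadb
    from llama_index.vector_stores.chroma import ChromaVectorStore

    # In-memory client for the demo; use chromadb.PersistentClient(path=...) to persist.
    chroma_client = chromadb.EphemeralClient()
    chroma_collection = chroma_client.get_or_create_collection(COLLECTION_NAME)
    chroma_vector_store = ChromaVectorStore(chroma_collection=chroma_collection)

    if os.environ.get("OPENAI_API_KEY"):
        # Assumed AG2 import path; verify against your installed ag2 version's docs.
        from autogen.agentchat.contrib.rag import LlamaIndexQueryEngine

        query_engine = LlamaIndexQueryEngine(vector_store=chroma_vector_store)
        query_engine.init_db(new_doc_paths_or_urls=["./docs/report.md"])  # placeholder path
        print(query_engine.query("What does the report say about revenue?"))
except ImportError as exc:
    print(f"optional dependency missing: {exc.name}")
```

Keeping the vector store construction separate from the engine means you can swap in any other LlamaIndex-supported store without touching the query code.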
The Pinecone example additionally requires a PINECONE_API_KEY
in your environment variables.
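Swapping in Pinecone follows the same pattern. The names below come from the Pinecone Python SDK and LlamaIndex's Pinecone integration, the index name is a placeholder, and the AG2 import path is an assumption; nothing network-facing runs unless both API keys are set:

```python
import os

INDEX_NAME = "ag2-demo-index"  # placeholder; the Pinecone index must already exist
have_keys = bool(os.environ.get("PINECONE_API_KEY") and os.environ.get("OPENAI_API_KEY"))

if have_keys:
    from pinecone import Pinecone
    from llama_index.vector_stores.pinecone import PineconeVectorStore

    pc = Pinecone(api_key=os.environ["PINECONE_API_KEY"])
    pinecone_index = pc.Index(INDEX_NAME)
    pinecone_vector_store = PineconeVectorStore(pinecone_index=pinecone_index)

    # Assumed AG2 import path; verify against your installed ag2 version's docs.
    from autogen.agentchat.contrib.rag import LlamaIndexQueryEngine

    query_engine = LlamaIndexQueryEngine(vector_store=pinecone_vector_store)
    query_engine.init_db(new_doc_paths_or_urls=["./docs/report.md"])  # placeholder path
    print(query_engine.query("Summarize the report."))
else:
    print("PINECONE_API_KEY or OPENAI_API_KEY not set; skipping the live query")
```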