agentchat.contrib.graph_rag.neo4j_graph_query_engine
Neo4jGraphQueryEngine
This class serves as a wrapper for a property graph query engine backed by LlamaIndex and Neo4j, facilitating the creation, connection, updating, and querying of LlamaIndex property graphs.
It builds a property graph Index from input documents, storing and retrieving data from the property graph in the Neo4j database.
It extracts triplets, i.e., [entity] -> [relationship] -> [entity] sets, from the input documents using LlamaIndex extractors.
Users can provide custom entities, relationships, and schema to guide the extraction process.
If strict is True, the engine will extract triplets following the schema of allowed relationships for each entity specified in the schema.
It also leverages LlamaIndex’s chat engine, which maintains conversation history internally, to provide context-aware responses.
For usage, please refer to example notebook/agentchat_graph_rag_neo4j.ipynb
__init__
Initialize a Neo4j Property graph. Please also refer to https://docs.llamaindex.ai/en/stable/examples/property_graph/graph_store/
Arguments:
- name (str) - Property graph name.
- host (str) - Neo4j hostname.
- port (int) - Neo4j port number.
- database (str) - Neo4j database name.
- username (str) - Neo4j username.
- password (str) - Neo4j password.
- llm (LLM) - Language model used to extract triplets.
- embedding (BaseEmbedding) - Embedding model used to construct the index and embed queries.
- entities (Optional[TypeAlias]) - Custom suggested entities to include in the graph.
- relations (Optional[TypeAlias]) - Custom suggested relations to include in the graph.
- schema (Optional[Union[Dict[str, str], List[Triple]]]) - Custom schema specifying the allowed relationships for each entity.
- strict (Optional[bool]) - If False, allows values outside of the input schema.
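A minimal construction sketch, assuming the autogen package prefix for the import and OpenAI-backed LlamaIndex models; the connection settings, entities, relations, and schema below are illustrative placeholders, not part of the API:

```python
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai import OpenAI

from autogen.agentchat.contrib.graph_rag.neo4j_graph_query_engine import Neo4jGraphQueryEngine

# Illustrative domain knowledge to guide triplet extraction; replace with your own.
entities = ["EMPLOYEE", "DEPARTMENT", "POLICY"]
relations = ["BELONGS_TO", "SUBJECT_TO"]
schema = [
    ("EMPLOYEE", "BELONGS_TO", "DEPARTMENT"),
    ("EMPLOYEE", "SUBJECT_TO", "POLICY"),
]

query_engine = Neo4jGraphQueryEngine(
    name="employee_graph",        # property graph name
    host="bolt://localhost",      # Neo4j bolt endpoint (placeholder)
    port=7687,
    database="neo4j",
    username="neo4j",
    password="password",
    llm=OpenAI(model="gpt-4o", temperature=0.0),                 # triplet-extraction LLM
    embedding=OpenAIEmbedding(model="text-embedding-3-small"),   # index/query embeddings
    entities=entities,
    relations=relations,
    schema=schema,
    strict=True,  # only keep triplets that match the schema above
)
```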
init_db
Build the knowledge graph with input documents.
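A hedged sketch of building the graph, assuming the Document/DocumentType wrapper from this package and an illustrative local file path:

```python
from autogen.agentchat.contrib.graph_rag.document import Document, DocumentType

# The path below is a placeholder; input records must be local files.
input_documents = [Document(doctype=DocumentType.TEXT, path_or_url="./data/employee_handbook.txt")]

# Extracts triplets from the documents and persists the property graph in Neo4j.
query_engine.init_db(input_documents)
```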
connect_db
Connect to an existing knowledge graph database.
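A sketch of reusing a previously built graph, assuming the engine was constructed with the same Neo4j connection settings and that connect_db needs no additional arguments:

```python
# Attach to the property graph already stored in Neo4j instead of rebuilding it.
query_engine.connect_db()
```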
add_records
Add new records to the knowledge graph. Must be local files.
Arguments:
- new_records (List[Document]) - List of new documents to add.
Returns:
bool - True if successful, False otherwise.
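A hedged sketch of appending documents, reusing the assumed Document wrapper above with a placeholder path:

```python
from autogen.agentchat.contrib.graph_rag.document import Document, DocumentType

new_records = [Document(doctype=DocumentType.TEXT, path_or_url="./data/new_policy.txt")]  # placeholder path
if not query_engine.add_records(new_records):
    print("add_records reported a failure")
```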
query
Query the property graph with a question using the LlamaIndex chat engine. We use the condense_plus_context chat mode, which condenses the conversation history and the user query into a standalone question, and then builds context for that standalone question from the property graph to generate a response.
Arguments:
- question - A human input question.
- n_results - Number of results to return.
Returns:
A GraphStoreQueryResult object containing the answer and related triplets.
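A usage sketch; the answer attribute on the result is assumed from the return description above:

```python
result = query_engine.query("Which department handles travel reimbursements?", n_results=3)
print(result.answer)  # natural-language answer; related triplets are returned alongside it
```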