PandasAI with Gemini API #1412
Hi Team, following is the code I have written to connect PandasAI with Gemini:

```python
import os
import streamlit as st
import google.generativeai as genai

# initialize chat history in streamlit session state
if "chat_history" not in st.session_state:
    ...

# display chat history
for message in st.session_state.chat_history:
    ...

# input field for user's message
user_prompt = st.chat_input("Ask MIS Agent...")

genai.configure(api_key=os.getenv("API_KEY"))

if user_prompt:
    ...
```
When I ask a question, it throws the following error:

```
ValidationError: 1 validation error for Config
llm
  instance of LLM expected (type=type_error.arbitrary_type; expected_arbitrary_type=LLM)
```

Please help me.
Replies: 1 comment
To resolve the "ValidationError: 1 validation error for Config llm instance of LLM expected" error, you need to ensure that the `llm` attribute in your `Config` object is an instance of a class that inherits from `LLM` or `LangchainLLM`. Since you are using the Gemini API, you should wrap it in a custom class that implements the required methods for the LLM interface.

Here's an example of how you might define a custom LLM class for the Gemini API:

```python
import os

from pandasai import Agent
from pandasai.llm import LLM

class GeminiLLM(LLM):
    def __init__(self, api_key):
        self.api_key = api_key
        # Initialize any other necessary attributes

    def call(self, instruction, context=None):
        # Implement the logic to call the Gemini API and return the response
        pass

# Usage
google_llm = GeminiLLM(api_key=os.getenv("API_KEY"))
agent = Agent(sales_by_partner, config={"llm": google_llm})
```

Make sure to implement the `call` method so it actually queries the Gemini API and returns its response.
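For completeness, here is a minimal, self-contained sketch of the adapter pattern the skeleton above relies on. Everything in it is illustrative, not pandasai's real API: the `generate_fn` parameter and the fake client are stand-ins so the control flow runs without the Gemini SDK installed (in your app you would pass a function that invokes `google.generativeai` and subclass pandasai's `LLM` instead of the plain class shown here), and the `type` property is an assumption about what the base class expects.

```python
class GeminiLLM:
    """Adapter exposing a text-generation client through a pandasai-style
    interface: a `call` method plus a `type` identifier (assumed shape)."""

    def __init__(self, api_key, generate_fn):
        # generate_fn: any callable taking a prompt string and returning text,
        # e.g. a closure around google.generativeai's generate_content call.
        self.api_key = api_key
        self._generate = generate_fn

    @property
    def type(self):
        # Identifier for this backend (assumed to be required by the base class).
        return "gemini"

    def call(self, instruction, context=None):
        # Prepend any conversational context, then delegate to the client.
        prompt = str(instruction)
        if context is not None:
            prompt = f"{context}\n\n{prompt}"
        return self._generate(prompt)

# Usage with a fake client, just to show the control flow:
fake_client = lambda prompt: f"echo: {prompt}"
llm = GeminiLLM(api_key="dummy", generate_fn=fake_client)
print(llm.call("Total sales by partner?"))  # echo: Total sales by partner?
```

The point of injecting `generate_fn` is that the wrapper stays testable and SDK-agnostic; only the callable you pass in needs to know how to talk to Gemini.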