Posts

Query your own documents with LlamaIndex and Gemini

In this article I explain how to build an application that indexes and queries your own documents using LlamaIndex and Gemini, with a step-by-step guide in Python.

Prerequisites for this example:
- Visual Studio Code
- Python
- A Gemini API key, which can be obtained from https://aistudio.google.com

Open Visual Studio Code and create a file named "demo.py". Then go to the Terminal menu and click New Terminal to open a terminal. In the terminal, run the command below to install the LlamaIndex library along with its Gemini LLM and embedding packages:

pip install llama-index llama-index-llms-gemini llama-index-embeddings-gemini

Create a folder named "doc" in the root directory of the application and store the documents you want to query in it. Now copy the code below into the "demo.py" file: from llama_index.embeddings.gemini import GeminiEmbedd…

Working with LangChain and Google Gemini integration step by step

In this article I explain, step by step, how to use the Gemini AI model with the LangChain library in Python. LangChain is a powerful framework that makes it easy to develop LLM applications.

Prerequisites for this example:
- Visual Studio Code
- Python
- A Gemini API key, which can be obtained from https://aistudio.google.com

Open Visual Studio Code and create a file named "langchainexample.py". Then go to the Terminal menu and click New Terminal to open a terminal. In the terminal, run the command below to install the LangChain package for Google generative AI:

pip install langchain-google-genai

Now add the code below to the "langchainexample.py" file: from langchain_google_genai import ChatGoogleGenerativeAI apikey="your api key" inputstr="" llm=ChatGoogleGenerativeAI(model="gemini-pro",google_api_key=apikey) while inputst…