# phidata
⭐️ phidata — for when you need to spin up an AI project quickly.
## ⭐ Features

- **Powerful:** Get a production-ready LLM App with 1 command.
- **Simple:** Built using a human-like `Conversation` interface to language models.
- **Production Ready:** Deploy your app to AWS with 1 command.

## 🚀 How it works

- Create your codebase using a template: `phi ws create`
- Run your app locally: `phi ws up dev:docker`
- Run your app on AWS: `phi ws up prd:aws`

## 💻 Example: Build a RAG LLM App

Let's build a **RAG LLM App** with GPT-4. We'll use PgVector for the Knowledge Base and Storage, and serve the app using Streamlit and FastApi. Read the full tutorial here.

> Install Docker Desktop to run this app locally.

### Installation

Open the `Terminal` and create an `ai` directory with a python virtual environment.

```bash
mkdir ai && cd ai
python3 -m venv aienv
source aienv/bin/activate
```

Install phidata

```bash
pip install phidata
```

### Create your codebase

Create your codebase using the `llm-app` template, pre-configured with FastApi, Streamlit and PgVector. Use this codebase as a starting point for your LLM product.

```bash
phi ws create -t llm-app -n llm-app
```

This will create a folder named `llm-app`.

### Serve your LLM App using Streamlit

Streamlit allows us to build micro front-ends for our LLM App and is extremely useful for building basic applications in pure python. Start the `app` group using:

```bash
phi ws up --group app
```

**Press Enter** to confirm, then allow a few minutes for the image to download (the first time only). Verify container status and view logs on the Docker dashboard.

### Example: Chat with PDFs

- Open localhost:8501 to view Streamlit apps that you can customize and make your own.
- Click on **Chat with PDFs** in the sidebar.
- Enter a username and wait for the knowledge base to load.
- Choose the `RAG` Conversation type.
- Ask "How do I make chicken curry?"
- Upload PDFs and ask questions.
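Under the hood, a `RAG` Conversation type retrieves the knowledge-base chunks most relevant to the user's question and injects them into the prompt before calling the model (in this template, the chunks live in PgVector). The following is a minimal, library-free sketch of that idea only — all names here (`retrieve`, `build_prompt`, the toy word-overlap score) are illustrative, not phidata's actual API, which uses vector similarity search instead:

```python
import re

def tokenize(text: str) -> set[str]:
    """Lowercase word set; a stand-in for a real embedding."""
    return set(re.findall(r"[a-z]+", text.lower()))

def score(question: str, chunk: str) -> int:
    """Toy relevance score: number of words shared with the question."""
    return len(tokenize(question) & tokenize(chunk))

def retrieve(question: str, knowledge_base: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k knowledge-base chunks most relevant to the question."""
    ranked = sorted(knowledge_base, key=lambda c: score(question, c), reverse=True)
    return ranked[:top_k]

def build_prompt(question: str, knowledge_base: list[str]) -> str:
    """Prepend the retrieved references to the question — the RAG prompt."""
    references = "\n".join(retrieve(question, knowledge_base))
    return f"Use this information if it helps:\n{references}\n\nQuestion: {question}"

if __name__ == "__main__":
    kb = [
        "Chicken curry: simmer chicken with onion, garlic and curry paste.",
        "Pad thai: stir-fry rice noodles with tamarind and peanuts.",
        "Docker Desktop lets you run containers locally.",
    ]
    print(build_prompt("How do I make chicken curry?", kb))
```

A production setup replaces the word-overlap score with embedding similarity queried from the vector store, but the flow — retrieve, then augment the prompt — is the same.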