Sat, Sep 7, 2024
Nimble Search
ai
search
LLM
docker
Nimble Search is an open-source AI-powered search engine that leverages local and cloud-based LLMs to deliver intelligent search results.
An AI-powered search engine (a Perplexity clone).
Run local LLMs (llama3, gemma, mistral, phi3), custom LLMs through LiteLLM, or cloud models (Groq/Llama3, OpenAI/gpt-4o).
Please feel free to contact me on 𝕏 or create an issue if you have any questions.
💻 Live Demo
nimble-search.vercel.app (Cloud models only)
📖 Overview
- 🛠️ Tech Stack
- 🏃🏿‍♂️ Getting Started
- 🚀 Deploy
🛣️ Roadmap
- Add support for local LLMs through Ollama
- Docker deployment setup
- Add support for SearXNG, eliminating the need for external search API dependencies
- Create a pre-built Docker Image
- Add support for custom LLMs through LiteLLM
- Chat History
- Expert Search
- Chat with local files
🛠️ Tech Stack
- Frontend: Next.js
- Backend: FastAPI
- Search API: SearXNG, Tavily, Serper, Bing
- Logging: Logfire
- Rate Limiting: Redis
- UI Components: shadcn/ui
Features
- Search with multiple search providers (Tavily, SearXNG, Serper, Bing)
- Answer questions with cloud models (OpenAI/gpt-4o, OpenAI/gpt-3.5-turbo, Groq/Llama3)
- Answer questions with local models (llama3, mistral, gemma, phi3)
- Answer questions with any custom LLMs through LiteLLM
- Search with an agent that plans and executes the search for better results
🏃🏿‍♂️ Getting Started Locally
Prerequisites
- Docker
- Ollama (for running local models)
- Download any of the supported models, e.g. `ollama pull llama3` (llama3, mistral, gemma, phi3)
- Start the Ollama server with
ollama serve
Get API Keys
Quick Start
git clone https://github.com/gauravmandall/nimble-search.git
cd nimble-search && cp .env-template .env
Modify the `.env` file with your API keys (optional, not required if using Ollama).
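As a sketch, a populated `.env` might look like the following; the variable names and key formats below are illustrative assumptions, so copy the exact names from `.env-template`:

```shell
# Hypothetical entries — the real variable names live in .env-template.
# Only set the keys for the providers you actually use.
TAVILY_API_KEY=tvly-xxxxxxxx
OPENAI_API_KEY=sk-xxxxxxxx
GROQ_API_KEY=gsk_xxxxxxxx
```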
Start the app:
docker-compose -f docker-compose.dev.yaml up -d
Wait for the app to start, then visit http://localhost:3000.
For custom setup instructions, see custom-setup-instructions.md.
🚀 Deploy
Backend
After deploying the backend, copy the web service URL (e.g., https://your-backend.onrender.com).
Frontend
Use the backend URL in the `NEXT_PUBLIC_API_URL` environment variable when deploying with Vercel.
And you're done! 🥳
Use Nimble Search as Your Default Search Engine
To set Nimble Search as your default search engine:
- Visit your browser's settings.
- Navigate to 'Search Engines'.
- Create a new search engine entry using the URL: http://localhost:3000/?q=%s
- Add the search engine.
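The `%s` placeholder is what the browser replaces with your URL-encoded search query; the substitution can be sketched in shell (the query below is just an example):

```shell
# the browser URL-encodes your query and substitutes it for %s
query="open+source+search"
echo "http://localhost:3000/?q=${query}"
# prints http://localhost:3000/?q=open+source+search
```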
"Innovation is the ability to see change as an opportunity - not a threat."
Unknown