BizSearch
A tool for finding and analyzing local businesses using AI-powered data extraction.
Prerequisites
- Node.js 16+
- Ollama (for local LLM)
- SearxNG instance
Installation
- Install Ollama:
```bash
# On macOS
brew install ollama
```
- Start Ollama:
```bash
# Start and enable on login
brew services start ollama

# Or run without auto-start
# (on Apple Silicon the Homebrew prefix is /opt/homebrew instead of /usr/local)
/usr/local/opt/ollama/bin/ollama serve
```
- Pull the required model:
```bash
ollama pull mistral
```
- Clone and set up the project:
```bash
git clone https://github.com/yourusername/bizsearch.git
cd bizsearch
npm install
```
- Configure environment:
```bash
cp .env.example .env
# Edit .env with your settings (see Configuration below)
```
- Start the application:
```bash
npm run dev
```
- Open http://localhost:3000 in your browser
Troubleshooting
If Ollama fails to start:
```bash
# Stop any existing instance
brew services stop ollama
# Wait a few seconds
sleep 5
# Start again
brew services start ollama
```
To verify Ollama is running:
```bash
curl http://localhost:11434/api/version
```
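If you prefer to run the same checks from code (for example as part of the app's service health monitoring), here is a minimal TypeScript sketch. It only assumes the default Ollama port shown above and the `SEARXNG_URL` value from your `.env`; the fallback SearxNG port is an assumption.

```typescript
// health-check.ts -- hypothetical helper; not part of the repository.
// Requires Node 18+ (built-in fetch) or a fetch polyfill.

async function checkService(name: string, url: string): Promise<boolean> {
  try {
    const res = await fetch(url);
    console.log(`${name}: ${res.ok ? 'ok' : `HTTP ${res.status}`}`);
    return res.ok;
  } catch {
    console.log(`${name}: unreachable`);
    return false;
  }
}

async function main() {
  // Ollama exposes its version at /api/version (same URL as the curl check above).
  await checkService('ollama', 'http://localhost:11434/api/version');
  // SEARXNG_URL comes from .env; the fallback port here is only an assumption.
  await checkService('searxng', process.env.SEARXNG_URL ?? 'http://localhost:8080');
}

main();
```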
Features
- Business search with location filtering
- Contact information extraction
- AI-powered data validation
- Clean, user-friendly interface
- Service health monitoring
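To give a feel for the contact-information extraction and AI-powered validation listed above, here is an illustrative sketch of a call to the local Ollama model pulled earlier. The real pipeline lives in `src/` and will differ; the prompt, the `ContactInfo` fields, and the helper name are assumptions, not the project's actual code.

```typescript
// extract-contact.ts -- illustrative sketch only; the project's real pipeline lives in src/.
interface ContactInfo {
  name?: string;
  phone?: string;
  email?: string;
  address?: string;
}

async function extractContactInfo(pageText: string): Promise<ContactInfo> {
  // Ollama's generate endpoint; the model must already be pulled (`ollama pull mistral`).
  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'mistral',
      prompt:
        'Extract the business name, phone, email and address from the text below. ' +
        `Reply with JSON only.\n\n${pageText}`,
      stream: false,
    }),
  });
  const data = await res.json();
  // data.response holds the model's raw text; parsing fails if the model strays from JSON.
  return JSON.parse(data.response) as ContactInfo;
}
```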
Configuration
Key environment variables:
- `SEARXNG_URL`: Your SearxNG instance URL
- `OLLAMA_URL`: Ollama API endpoint (default: `http://localhost:11434`)
- `SUPABASE_URL`: Your Supabase project URL
- `SUPABASE_ANON_KEY`: Your Supabase anonymous key
- `CACHE_DURATION_DAYS`: How long to cache results, in days (default: 7)
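For reference, a filled-in `.env` might look like the following. The values are placeholders taken from the list above; check `.env.example` for the authoritative set of variables.

```bash
# .env -- placeholder values, adjust to your setup
SEARXNG_URL=http://localhost:8080
OLLAMA_URL=http://localhost:11434
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your-anon-key
CACHE_DURATION_DAYS=7
```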
Supabase Setup
- Create a new Supabase project
- Run the SQL commands in `db/init.sql` to create the cache table
- Copy your project URL and anon key to `.env`
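With those two values in `.env`, the app can talk to Supabase through the official client. A minimal sketch, assuming `@supabase/supabase-js` is used directly (the project's own setup may wrap this differently):

```typescript
// supabase-client.ts -- minimal sketch; the project's own setup may differ.
import { createClient } from '@supabase/supabase-js';

// Both values come from the .env file configured above.
export const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_ANON_KEY!,
);
```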
License
MIT
Cache Management
The application uses Supabase for caching search results. Cache entries expire after 7 days by default (configurable via `CACHE_DURATION_DAYS`).
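As a sketch of how such a cache read with expiry could look: the table and column names below (`search_cache`, `query`, `created_at`) are assumptions for illustration; `db/init.sql` defines the real schema.

```typescript
// cache-read.ts -- table and column names are assumptions; db/init.sql defines the real schema.
import { supabase } from './supabase-client';

const CACHE_DURATION_DAYS = Number(process.env.CACHE_DURATION_DAYS ?? 7);

export async function getCachedResults(query: string) {
  const cutoff = new Date(Date.now() - CACHE_DURATION_DAYS * 24 * 60 * 60 * 1000);

  // Treat anything older than the cache window as expired and ignore it.
  const { data, error } = await supabase
    .from('search_cache')
    .select('*')
    .eq('query', query)
    .gte('created_at', cutoff.toISOString());

  if (error) throw error;
  return data && data.length > 0 ? data : null;
}
```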
Manual Cache Cleanup
If automatic cleanup is not available, you can manually clean up expired entries:
- Using the API:
```bash
curl -X POST http://localhost:3000/api/cleanup
```
- Using SQL:
```sql
select manual_cleanup();
```
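If you want cleanup to run on a schedule rather than by hand, one simple approach is to hit the same endpoint periodically from a small script. This is only a sketch, not something the project ships, and it assumes the app is listening on `localhost:3000` as in the curl example above.

```typescript
// scheduled-cleanup.ts -- sketch only; assumes the app is listening on localhost:3000.
const CLEANUP_INTERVAL_MS = 24 * 60 * 60 * 1000; // once a day

async function cleanup(): Promise<void> {
  try {
    const res = await fetch('http://localhost:3000/api/cleanup', { method: 'POST' });
    console.log(`cache cleanup: ${res.ok ? 'done' : `HTTP ${res.status}`}`);
  } catch (err) {
    console.error('cache cleanup failed:', err);
  }
}

cleanup();
setInterval(cleanup, CLEANUP_INTERVAL_MS);
```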
Cache Statistics
View cache statistics using:
```sql
select * from cache_stats;
```
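The same statistics can also be read programmatically. A sketch, assuming `cache_stats` is exposed as a view readable with the anon key:

```typescript
// cache-stats.ts -- assumes cache_stats is a view readable with the anon key.
import { supabase } from './supabase-client';

export async function printCacheStats(): Promise<void> {
  const { data, error } = await supabase.from('cache_stats').select('*');
  if (error) throw error;
  console.table(data);
}
```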