Perplexica/src/lib/config.ts
TamHC 883e457009 (2025-06-24 18:26:18 +08:00)
implemented the history retention feature
### 1. __History Retention Configuration__

- __config.toml__: Added a `[HISTORY]` section with a `RETENTION_DAYS = 30` setting (excerpt below)
- __Backend Integration__: Updated configuration handling to support history retention
- __API Endpoints__: Modified `/api/config` to read/write history retention settings
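
The relevant excerpt of config.toml, based on the description above (other sections such as `[GENERAL]` and `[MODELS]` are omitted here):

```toml
# config.toml (excerpt)
[HISTORY]
# Days to keep non-incognito chat history; 0 means keep forever
RETENTION_DAYS = 30
```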

### 2. __User Interface__

- __Settings Page__: Added "History Settings" section with number input for retention days
- __Real-time Updates__: Settings are saved to config.toml as soon as they change (see the sketch after this list)
- __Clear Documentation__: Explains that retention only applies when incognito mode is off
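
A rough sketch of how the settings page can persist the value, assuming `/api/config` accepts a partial config object in its POST body (mirroring the `RecursivePartial<Config>` that `updateConfig` consumes further down); the real settings component may differ:

```ts
// Hypothetical helper for the settings page; the request shape is an
// assumption, not taken from the actual component code.
const saveRetentionDays = async (days: number) => {
  const res = await fetch('/api/config', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // Partial config object, merged server-side into config.toml
    body: JSON.stringify({ HISTORY: { RETENTION_DAYS: days } }),
  });

  if (!res.ok) {
    throw new Error(`Failed to save history settings: ${res.status}`);
  }
};
```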

### 3. __Automatic History Cleanup__

- __Background Processing__: Cleanup runs automatically when new chats are created outside incognito mode (see the sketch after this list)
- __Smart Logic__: Only deletes chats older than the configured retention period
- __Complete Cleanup__: Removes both chat records and associated messages
- __Performance Optimized__: Non-blocking background execution
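
A minimal sketch of that cleanup logic, assuming a helper in the dedicated utility file described in the next section; `deleteChatsAndMessagesOlderThan` is a placeholder for the project's real database calls, and the import path is assumed:

```ts
// Sketch of the cleanup utility (names and paths are assumptions).
import { getHistoryRetentionDays } from './config';

// Placeholder: the real implementation must delete both the chat rows and
// their associated messages from the database.
const deleteChatsAndMessagesOlderThan = async (cutoff: Date): Promise<number> => {
  throw new Error('wire this to the real database layer');
};

// RETENTION_DAYS = 0 means "keep forever", so cleanup is skipped entirely.
export const cleanupOldChats = async (): Promise<void> => {
  const retentionDays = getHistoryRetentionDays();
  if (!retentionDays || retentionDays <= 0) return;

  const cutoff = new Date(Date.now() - retentionDays * 24 * 60 * 60 * 1000);
  const removed = await deleteChatsAndMessagesOlderThan(cutoff);
  console.log(`Removed ${removed} chats older than ${cutoff.toISOString()}`);
};
```

When a chat is created outside incognito mode, this can be invoked fire-and-forget, e.g. `cleanupOldChats().catch(console.error)`, so the request that creates the chat is never blocked by cleanup work.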

### 4. __Manual Cleanup API__

- __Endpoint__: `POST /api/cleanup-history` for manual cleanup triggers (route sketch below)
- __Utility Functions__: Reusable cleanup logic in dedicated utility file
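
A sketch of what the route handler could look like, assuming a Next.js app-router handler in `src/app/api/cleanup-history/route.ts` that delegates to the cleanup utility sketched above (file location and import path are assumptions):

```ts
// src/app/api/cleanup-history/route.ts (sketch; paths and names assumed)
import { NextResponse } from 'next/server';
import { cleanupOldChats } from '@/lib/cleanupHistory';

export const POST = async () => {
  try {
    await cleanupOldChats();
    return NextResponse.json({ message: 'History cleanup completed' });
  } catch (err) {
    console.error('Manual history cleanup failed', err);
    return NextResponse.json({ message: 'Cleanup failed' }, { status: 500 });
  }
};
```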

### 5. __Docker Rebuild__

- __Container Rebuild__: Successfully rebuilt the Docker containers with new features
- __Configuration Persistence__: config.toml changes are preserved in Docker volume
- __Application Ready__: The application should now be accessible at <http://localhost:3000>

## Key Features:

1. __Incognito Mode Integration__: History retention only applies when incognito mode is OFF
2. __Flexible Configuration__: 0 = keep forever, any positive number = days to retain
3. __Automatic Cleanup__: Runs in background when creating new chats
4. __Manual Control__: API endpoint for manual cleanup triggers
5. __Database Integrity__: Properly removes both chats and associated messages

## Testing the Feature:

1. __Access the Application__: Open <http://localhost:3000> in your browser
2. __Configure Settings__: Go to Settings → History Settings → Set retention days
3. __Test Incognito Mode__: Toggle incognito mode on/off to see different behaviors
4. __Create Test Chats__: Create chats in both modes to verify functionality
5. __Manual Cleanup__: Use the `/api/cleanup-history` endpoint to trigger cleanup manually (example below)
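
For step 5, the endpoint can be exercised from the browser console or a small script; this sketch assumes the endpoint needs no request body:

```ts
// Trigger a manual history cleanup against the locally running instance.
const res = await fetch('http://localhost:3000/api/cleanup-history', {
  method: 'POST',
});
console.log(res.status, await res.json());
```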

import toml from '@iarna/toml';

// Use dynamic imports for Node.js modules to prevent client-side errors
let fs: any;
let path: any;

if (typeof window === 'undefined') {
  // We're on the server
  fs = require('fs');
  path = require('path');
}

const configFileName = 'config.toml';

interface Config {
  GENERAL: {
    SIMILARITY_MEASURE: string;
    KEEP_ALIVE: string;
  };
  HISTORY: {
    RETENTION_DAYS: number;
  };
  MODELS: {
    OPENAI: {
      API_KEY: string;
    };
    GROQ: {
      API_KEY: string;
    };
    ANTHROPIC: {
      API_KEY: string;
    };
    GEMINI: {
      API_KEY: string;
    };
    OLLAMA: {
      API_URL: string;
    };
    DEEPSEEK: {
      API_KEY: string;
    };
    LM_STUDIO: {
      API_URL: string;
    };
    CUSTOM_OPENAI: {
      API_URL: string;
      API_KEY: string;
      MODEL_NAME: string;
    };
  };
  API_ENDPOINTS: {
    SEARXNG: string;
  };
}
type RecursivePartial<T> = {
  [P in keyof T]?: RecursivePartial<T[P]>;
};

const loadConfig = () => {
  // Server-side only
  if (typeof window === 'undefined') {
    return toml.parse(
      fs.readFileSync(path.join(process.cwd(), `${configFileName}`), 'utf-8'),
    ) as any as Config;
  }

  // Client-side fallback - settings will be loaded via API
  return {} as Config;
};

export const getSimilarityMeasure = () =>
  loadConfig().GENERAL.SIMILARITY_MEASURE;

export const getKeepAlive = () => loadConfig().GENERAL.KEEP_ALIVE;

export const getHistoryRetentionDays = () =>
  loadConfig().HISTORY.RETENTION_DAYS;

export const getOpenaiApiKey = () => loadConfig().MODELS.OPENAI.API_KEY;

export const getGroqApiKey = () => loadConfig().MODELS.GROQ.API_KEY;

export const getAnthropicApiKey = () => loadConfig().MODELS.ANTHROPIC.API_KEY;

export const getGeminiApiKey = () => loadConfig().MODELS.GEMINI.API_KEY;

export const getSearxngApiEndpoint = () =>
  process.env.SEARXNG_API_URL || loadConfig().API_ENDPOINTS.SEARXNG;

export const getOllamaApiEndpoint = () => loadConfig().MODELS.OLLAMA.API_URL;

export const getDeepseekApiKey = () => loadConfig().MODELS.DEEPSEEK.API_KEY;

export const getCustomOpenaiApiKey = () =>
  loadConfig().MODELS.CUSTOM_OPENAI.API_KEY;

export const getCustomOpenaiApiUrl = () =>
  loadConfig().MODELS.CUSTOM_OPENAI.API_URL;

export const getCustomOpenaiModelName = () =>
  loadConfig().MODELS.CUSTOM_OPENAI.MODEL_NAME;

export const getLMStudioApiEndpoint = () =>
  loadConfig().MODELS.LM_STUDIO.API_URL;
// Recursively merge `update` into `current`: nested objects are merged
// key-by-key, undefined values are ignored, and keys absent from the update
// keep their current values.
const mergeConfigs = (current: any, update: any): any => {
  if (update === null || update === undefined) {
    return current;
  }

  if (typeof current !== 'object' || current === null) {
    return update;
  }

  const result = { ...current };

  for (const key in update) {
    if (Object.prototype.hasOwnProperty.call(update, key)) {
      const updateValue = update[key];

      if (
        typeof updateValue === 'object' &&
        updateValue !== null &&
        typeof result[key] === 'object' &&
        result[key] !== null
      ) {
        result[key] = mergeConfigs(result[key], updateValue);
      } else if (updateValue !== undefined) {
        result[key] = updateValue;
      }
    }
  }

  return result;
};

export const updateConfig = (config: RecursivePartial<Config>) => {
  // Server-side only
  if (typeof window === 'undefined') {
    const currentConfig = loadConfig();
    const mergedConfig = mergeConfigs(currentConfig, config);

    fs.writeFileSync(
      path.join(process.cwd(), `${configFileName}`),
      toml.stringify(mergedConfig),
    );
  }
};