llama.cpp-GUI/inspect_config.py
coremaven 0356871946 Initial commit: llama.cpp Server GUI
A professional PyQt6-based GUI for managing llama.cpp server instances.

Features:
- Server binary and model file selection
- Comprehensive server options (host, port, context, GPU layers, etc.)
- Start/Stop controls with non-blocking operations
- Real-time server log viewer
- Profile management (save/load/delete configurations)
- Configuration persistence
- System tray support
- Auto-start option

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2025-12-12 19:00:43 -05:00


#!/usr/bin/env python3
"""
Simple script to inspect the saved configuration file.
"""
import json
from pathlib import Path

config_file = Path.home() / ".llama_server_gui_config.json"

if config_file.exists():
    with open(config_file, 'r') as f:
        config = json.load(f)

    print("=== Configuration File Contents ===")
    print(f"File: {config_file}")
    print(f"\nLast Profile: {config.get('last_profile', 'None')}")
    print(f"\nNumber of Profiles: {len(config.get('profiles', {}))}")

    profiles = config.get('profiles', {})
    if profiles:
        print("\n=== Profiles ===")
        for name, settings in profiles.items():
            print(f"\nProfile: {name}")
            for key, value in settings.items():
                print(f"  {key}: {value}")
    else:
        print("\nNo profiles saved yet")
else:
    print(f"Configuration file not found: {config_file}")
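The script assumes a JSON file with a `last_profile` name and a `profiles` mapping of profile name to settings. A minimal sketch of exercising that layout, written to a temporary file so the real `~/.llama_server_gui_config.json` is untouched (the settings keys `host`, `port`, and `ctx_size` are illustrative assumptions, not confirmed fields):

```python
import json
import tempfile
from pathlib import Path

# Hypothetical example config matching the layout the script reads:
# a "last_profile" name plus a "profiles" dict of name -> settings.
sample = {
    "last_profile": "default",
    "profiles": {
        "default": {"host": "127.0.0.1", "port": 8080, "ctx_size": 4096},
    },
}

# Use a temp directory instead of Path.home() to avoid touching
# the real configuration file.
with tempfile.TemporaryDirectory() as tmp:
    config_file = Path(tmp) / ".llama_server_gui_config.json"
    config_file.write_text(json.dumps(sample))

    # Same reads the inspection script performs.
    config = json.loads(config_file.read_text())
    print(f"Last Profile: {config.get('last_profile', 'None')}")
    print(f"Number of Profiles: {len(config.get('profiles', {}))}")
```

Because the GUI persists profiles as plain JSON, the same pattern works for scripted edits (e.g. bulk-updating a port across profiles) as well as inspection.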