Hasty Briefs (beta)

Show HN: Clippy, 90s UI for local LLMs

a year ago
  • #offline
  • #LLM
  • #nostalgia
  • Clippy runs a variety of large language models (LLMs) locally behind a 1990s-style interface.
  • Offers a simple, classic chat window for sending messages and receiving responses.
  • No complicated setup is required; model performance is optimized automatically via llama.cpp and node-llama-cpp.
  • Supports custom models, prompts, and parameters for personalized use.
  • Operates offline and locally; the only network use is an optional update check.
  • Not affiliated with Microsoft; blends nostalgia for 1990s tech with modern LLM capabilities.
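The "runs LLMs locally via node-llama-cpp" point can be sketched with the library's basic chat API. This is a minimal illustration, not Clippy's actual code; the model path is a placeholder, and it assumes a GGUF model file is already downloaded.

```typescript
// Hypothetical sketch of local inference with node-llama-cpp,
// the library the summary says Clippy builds on.
import {getLlama, LlamaChatSession} from "node-llama-cpp";

async function main() {
  // getLlama() detects the platform and picks an optimized
  // llama.cpp build automatically (the "no complicated setup" point).
  const llama = await getLlama();

  // Placeholder path: any local GGUF model file would go here.
  const model = await llama.loadModel({
    modelPath: "./models/example-model.gguf",
  });

  const context = await model.createContext();
  const session = new LlamaChatSession({
    contextSequence: context.getSequence(),
  });

  // Everything runs offline; no network calls are made here.
  const answer = await session.prompt("It looks like you're writing a letter.");
  console.log(answer);
}

main();
```

Custom prompts and parameters (the fourth bullet) would map to options passed to `loadModel`, `createContext`, and `prompt` in a setup like this.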