Hasty Briefs (beta)

Transformer Lab

a year ago
  • #LLM
  • #AI
  • #OpenSource
  • Transformer Lab is supported by Mozilla through the Mozilla Builders Program.
  • It is an open-source platform for building, tuning, and running Large Language Models (LLMs) locally without coding.
  • Aims to enable software developers to integrate LLMs into their products without Python or ML expertise.
  • Features one-click downloads of popular models such as Llama 3, Phi-3, and Mistral from Hugging Face.
  • Supports fine-tuning across different hardware, including Apple Silicon (via MLX) and GPUs (via Hugging Face libraries).
  • Offers RLHF and preference-optimization techniques such as DPO, ORPO, SimPO, and reward modeling.
  • Compatible with Windows, macOS, and Linux.
  • Provides chat functionalities, preset prompts, chat history, and batch inference.
  • Supports multiple inference engines: MLX, Hugging Face Transformers, vLLM, and llama.cpp.
  • Includes evaluation tools such as the EleutherAI LM Evaluation Harness, LLM-as-a-judge, and red-teaming evals.
  • Features RAG (Retrieval Augmented Generation) with drag-and-drop file support.
  • Allows dataset building from Hugging Face or custom datasets via drag-and-drop.
  • Offers embedding calculations, a full REST API, and cloud or local deployment options.
  • Supports model conversion across formats (Hugging Face, MLX, GGUF) and plugin extensions.
  • Includes prompt editing, system message adjustments, and inference logs for transparency.
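Among the preference-optimization techniques the brief lists, DPO is the most widely used. Transformer Lab's actual training code is not shown here; as a rough sketch of what DPO optimizes, the per-pair loss can be written in plain Python (the function name and inputs are illustrative, not Transformer Lab's API):

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Illustrative DPO loss for a single preference pair.

    Inputs are summed log-probabilities of the chosen and rejected
    responses under the policy model and a frozen reference model.
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    margin = beta * (chosen_ratio - rejected_ratio)
    # -log(sigmoid(margin)): shrinks as the policy prefers the
    # chosen response more strongly than the reference does
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

When policy and reference agree exactly, the margin is zero and the loss sits at log 2; training pushes the policy's log-probability gap toward the human preference.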
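The RAG feature above combines the embedding calculations the brief also mentions with document retrieval. Transformer Lab's internals are not described in the post; as a minimal sketch of the retrieval step RAG depends on, assuming chunks have already been embedded (toy 2-D vectors stand in for real embeddings):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, chunks, k=2):
    """Return the texts of the k chunks most similar to the query.

    Each chunk is a dict with a "text" field and a precomputed
    embedding under "vec" (hypothetical structure for illustration).
    """
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c["vec"]),
                    reverse=True)
    return [c["text"] for c in ranked[:k]]
```

The retrieved chunks are then prepended to the prompt so the model answers from the dropped-in documents rather than from memory alone.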