Hasty Briefs

Fara-7B by Microsoft: An agentic small language model designed for computer use

15 days ago
  • #AI
  • #Microsoft
  • #Automation
  • Fara-7B is Microsoft's first agentic small language model (SLM) for computer use, with 7 billion parameters.
  • It achieves state-of-the-art performance in its size class and competes with larger models.
  • Fara-7B interacts with computers using mouse and keyboard inputs, performing tasks like scrolling, typing, and clicking.
  • The model is trained on 145K synthetic trajectories generated with the Magentic-One multi-agent framework.
  • Fara-7B can automate tasks such as searching, form filling, booking, shopping, and job hunting.
  • It outperforms comparable models on benchmarks such as WebVoyager, Online-Mind2Web, DeepShop, and WebTailBench.
  • WebTailBench is a new evaluation benchmark with 609 tasks focusing on real-world, underrepresented scenarios.
  • Fara-7B can be deployed locally using vLLM or accessed through Azure AI Foundry without needing dedicated GPU resources.
  • The model includes robust error handling, task verification, and supports LLM-as-a-judge evaluation.
  • Microsoft recommends using Fara-7B in a sandboxed environment for security and privacy.
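Since vLLM exposes an OpenAI-compatible endpoint when serving a model locally, an agent loop around Fara-7B would typically pair each task instruction with a screenshot and parse a structured action back out. The sketch below is illustrative only: the endpoint shape follows the generic OpenAI chat-completions format, and the model name and action schema (`click`/`type`/`scroll`/`stop` with pixel coordinates) are assumptions, not Microsoft's documented interface.

```python
import base64
import json

# Hypothetical sketch of one step of a computer-use agent loop against a
# locally served model behind vLLM's OpenAI-compatible API. Model name and
# action schema are illustrative assumptions.

def build_request(task: str, screenshot_png: bytes,
                  model: str = "microsoft/Fara-7B") -> dict:
    """Build an OpenAI-style chat payload pairing the task with a screenshot."""
    image_b64 = base64.b64encode(screenshot_png).decode("ascii")
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You control a browser with mouse and keyboard."},
            {"role": "user",
             "content": [
                 {"type": "text", "text": task},
                 {"type": "image_url",
                  "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
             ]},
        ],
    }

def parse_action(reply: str) -> dict:
    """Parse a JSON action such as {"action": "click", "coordinate": [x, y]}."""
    action = json.loads(reply)
    if action.get("action") not in {"click", "type", "scroll", "stop"}:
        raise ValueError(f"unexpected action: {action!r}")
    return action

# One round trip: request out, (simulated) model action back in.
payload = build_request("Find round-trip flights to Tokyo", b"\x89PNG...")
action = parse_action('{"action": "click", "coordinate": [412, 230]}')
print(action["action"], action["coordinate"])  # → click [412, 230]
```

In a real deployment the payload would be POSTed to the local server (e.g. `http://localhost:8000/v1/chat/completions`), the parsed action executed via mouse/keyboard automation, and a fresh screenshot fed into the next step until a `stop` action is returned.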