
VSORA Jotunn8: a 5nm European inference chip

  • #edge-computing
  • #AI-inference
  • #data-center
  • Modern data centers need to deploy trained models quickly, cost-efficiently, and at scale.
  • Key requirements for inference systems are high throughput, low latency, and optimized power consumption (see the latency/throughput sketch after this list).
  • Jotunn8 is an ultra-high-performance inference chip designed for speed, cost-efficiency, and sustainability.
  • Critical applications include real-time AI services like chatbots, fraud detection, and search.
  • Generative AI, reasoning models, and agentic frameworks combine to produce more capable AI systems.
  • The VSORA architecture is designed to integrate these AI workloads seamlessly at near-theoretical performance.
  • For edge AI, VSORA claims unmatched performance from fully programmable, algorithm-agnostic solutions.
  • RISC-V cores enable AI to run completely on-chip, improving efficiency.
  • Quoted performance: 3,200 TFLOPS at FP8 and 800 TFLOPS at FP16, with close-to-theory efficiency (illustrated in the utilization sketch below).
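
As a rough illustration of the throughput and latency metrics mentioned above, here is a minimal, hypothetical Python sketch that times a generic inference callable and reports latency percentiles plus throughput. The `measure_inference` helper, `model_fn`, and the dummy batch are stand-ins for illustration only, not part of any VSORA API.

```python
import statistics
import time

def measure_inference(model_fn, batch, n_runs=100):
    """Time repeated calls to a generic inference callable and report
    per-call latency percentiles plus overall throughput."""
    latencies = []
    for _ in range(n_runs):
        start = time.perf_counter()
        model_fn(batch)  # one inference pass over the batch
        latencies.append(time.perf_counter() - start)

    latencies.sort()
    total_time = sum(latencies)
    return {
        "p50_latency_ms": statistics.median(latencies) * 1e3,
        "p99_latency_ms": latencies[max(int(0.99 * n_runs) - 1, 0)] * 1e3,
        "throughput_items_per_s": n_runs * len(batch) / total_time,
    }

# Stand-in "model" that simply sleeps 5 ms per call, for demonstration only.
if __name__ == "__main__":
    print(measure_inference(lambda b: time.sleep(0.005), batch=[0] * 8))
```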
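
"Close-to-theory efficiency" is commonly expressed as sustained throughput divided by the chip's theoretical peak. The short sketch below uses the quoted FP8 peak of 3,200 TFLOPS; the achieved figure is invented purely for illustration and is not a published VSORA benchmark.

```python
def utilization(achieved_tflops: float, peak_tflops: float) -> float:
    """Fraction of theoretical peak compute actually sustained by a workload."""
    return achieved_tflops / peak_tflops

# Quoted FP8 peak is 3,200 TFLOPS; the achieved value below is hypothetical.
print(f"FP8 utilization: {utilization(2400.0, 3200.0):.0%}")  # -> FP8 utilization: 75%
```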