Perplexity Eyes Post-Training with Moonshot's Kimi K2 Model for AI Enhancement
Moonshot AI, a Beijing-based AI research firm, has unveiled its Kimi K2 model, a Mixture-of-Experts (MoE) architecture with 1T total parameters and 32B activated parameters. The model demonstrates exceptional capabilities in code generation and general-purpose agentic tasks, marking a significant advancement in open-source AI technology. This development has drawn attention from U.S. AI startup Perplexity, whose CEO has hinted at potential post-training on K2 to boost product performance.