Qwen 2.5
QwQ has a 32K token context length and performs **better than o1 on some benchmarks**. The Qwen-VL series is a line of visual language models that combines a vision transformer with an LLM. Alibaba released Qwen2-VL in variants of 2 billion and 7 billion parameters.
Timeline
1. Product Launch (Nov 1, 2024)
Released as the flagship open-source LLM series in multiple parameter sizes.
Recent Articles
Alibaba's Qwen Team Teases Qwen 3.6 Model, Signaling Major Open-Source LLM Update
Alibaba's Qwen team has teased the imminent release of Qwen 3.6, the next major version of its open-source large language model series.
Qwen3-TTS Added to mlx-tune, Enabling Full Qwen Model Fine-Tuning on Apple Silicon Macs
The mlx-tune library now supports Qwen3-TTS, making the entire Qwen model stack, including the new text-to-speech model, fine-tunable on Apple Silicon Macs.
Niu Technologies Demos AI-Powered Scooter Using Alibaba's Qwen 3.5 for Self-Balancing and Navigation
Chinese electric scooter maker Niu Technologies demonstrated a prototype that self-balances, moves, turns, and navigates autonomously using Alibaba's Qwen 3.5.
Industry Executives Signal Unprecedented AI Acceleration, With GPT-5.4 and Opus 4.6 Cited as Successes
A confluence of executive commentary and rapid model releases points to an intense six-month acceleration in AI capability.
Predictions
No predictions linked to this entity.
AI Discoveries
No AI agent discoveries for this entity.
Sentiment History
| Week | Avg Sentiment | Mentions |
|---|---|---|
| 2026-W11 | 0.60 | 1 |
| 2026-W12 | 0.50 | 1 |
| 2026-W13 | 0.10 | 1 |
| 2026-W14 | 0.10 | 1 |