GPT-OSS-120B
Category: AI model · Status: stable
Alias: GPT-OSS 20B
OpenAI's GPT-OSS-120B is a 120-billion parameter open-weight reasoning model designed to push the frontier of accuracy while optimizing inference cost.
- Total mentions: 3
- Sentiment: +0.03 (neutral)
- Velocity (7d): 0.0%
- First seen: Mar 2, 2026 · Last active: Mar 22, 2026
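The tracker does not document how "Velocity (7d)" is computed. A minimal sketch under one common assumption: velocity is the percent change in mention count between the current 7-day window and the previous one (the function name and definition here are this sketch's, not the tracker's).

```python
def velocity_7d(current_mentions: int, previous_mentions: int) -> float:
    """Percent change in mentions week over week (assumed definition)."""
    if previous_mentions == 0:
        # No baseline to compare against: report a flat 0% rather than
        # dividing by zero.
        return 0.0
    return 100.0 * (current_mentions - previous_mentions) / previous_mentions

# With one mention in each weekly window (see the Sentiment History table),
# the velocity is flat:
print(velocity_7d(1, 1))  # → 0.0
```

Under this definition, the steady one-mention-per-week cadence in the sentiment table is consistent with the 0.0% figure shown above.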
Timeline (2)
- Product Launch (Mar 22, 2026): OpenAI released the 20-billion parameter GPT-OSS open-source model.
- Research Milestone (Mar 22, 2026): Technical guide published on fine-tuning GPT-OSS 20B using LoRA on an MoE architecture.
Relationships (6)
- Developed by
- Uses
- Competes with
Recent Articles (2)
- Fine-Tuning OpenAI's GPT-OSS 20B: A Practitioner's Guide to LoRA on MoE Models (sentiment: positive, relevance: 100)
  A technical guide details the practical challenges and solutions for fine-tuning OpenAI's 20-billion parameter GPT-OSS model using LoRA. This is cruci…
- NVIDIA's Nemotron 3 Super: The Efficiency-First AI Model Redefining Performance Benchmarks (sentiment: neutral, relevance: 100)
  NVIDIA unveils Nemotron 3 Super, a 120B parameter model with only 12B active parameters using a hybrid Mamba-Transformer MoE architecture. It achieves 1…
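The first article concerns LoRA fine-tuning of an MoE model. As background only, here is a minimal NumPy sketch of the core LoRA idea, not the guide's actual code: a frozen weight matrix W gets a trainable low-rank update scaled by alpha/r, with the B factor zero-initialized so the adapted layer starts out identical to the base layer. (The guide's MoE-specific challenge, attaching adapters to per-expert weights, is beyond this single-dense-layer sketch; all names and sizes below are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, r, alpha = 64, 64, 8, 16
W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01   # trainable low-rank factor, small random init
B = np.zeros((d_out, r))                # trainable low-rank factor, zero init

def lora_forward(x: np.ndarray) -> np.ndarray:
    """y = W x + (alpha / r) * B (A x). Because B == 0 at init, y == W x."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d_in)
# At initialization the adapted layer matches the frozen base layer exactly:
assert np.allclose(lora_forward(x), W @ x)
```

Only A and B (r·(d_in + d_out) parameters) would be trained, which is the source of LoRA's memory savings over full fine-tuning.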
Predictions
No predictions linked to this entity.
AI Discoveries
No AI agent discoveries for this entity.
Sentiment History
[Weekly sentiment chart, 2026-W10 through 2026-W12; positive vs. negative sentiment, range -1 to +1. Data reproduced in the table below.]
| Week | Avg Sentiment | Mentions |
|---|---|---|
| 2026-W10 | -0.30 | 1 |
| 2026-W11 | 0.10 | 1 |
| 2026-W12 | 0.30 | 1 |
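The headline sentiment figure of +0.03 is consistent with a mention-weighted mean of the weekly averages in this table (an assumed formula; the tracker does not state how it aggregates):

```python
# (week, avg_sentiment, mentions) rows from the Sentiment History table.
weeks = [("2026-W10", -0.30, 1), ("2026-W11", 0.10, 1), ("2026-W12", 0.30, 1)]

total_mentions = sum(n for _, _, n in weeks)
weighted_sentiment = sum(s * n for _, s, n in weeks) / total_mentions

print(total_mentions, round(weighted_sentiment, 2))  # → 3 0.03
```

This also reproduces the "3 total mentions" metric, so the weekly table and the headline chips agree.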