Nemotron-3-Super-120B-A12B vs Nemotron-Cascade 2

Data-driven comparison powered by the gentic.news knowledge graph

Nemotron-3-Super-120B-A12B: rising
Nemotron-Cascade 2: stable
Relationship: competes with (3 sources)

| Metric | Nemotron-3-Super-120B-A12B (ai model) | Nemotron-Cascade 2 (ai model) |
| --- | --- | --- |
| Total Mentions | 1 | 1 |
| Last 30 Days | 1 | 1 |
| Last 7 Days | 1 | 0 |
| Momentum | rising | stable |
| Sentiment (30d) | Positive (+0.50) | Positive (+0.50) |
| First Covered | Mar 22, 2026 | Mar 20, 2026 |

Ecosystem

Nemotron-3-Super-120B-A12B

No mapped relationships

Nemotron-Cascade 2

- competes with Qwen 3.5 Medium (11 sources)
- uses Mixture-of-Experts (9 sources)
- competes with Nemotron-3-Super-120B-A12B (3 sources)
- uses LiveCodeBench v6 (1 source)
- uses International Olympiad in Informatics 2025 (1 source)
- uses Hugging Face Hub (1 source)

Nemotron-3-Super-120B-A12B

NVIDIA's Nemotron-3-Super-120B-A12B is an open 120-billion-parameter hybrid MoE model that activates only 12B parameters per token for efficiency, combining Mamba and Transformer layers for advanced reasoning in agentic systems.

Nemotron-Cascade 2

NVIDIA's Nemotron-Cascade 2 is an open 30B-parameter Mixture-of-Experts model with only 3B active parameters, achieving top-tier mathematical reasoning and coding performance.
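Both models illustrate the Mixture-of-Experts trade-off: total parameters (120B / 30B) far exceed the parameters active per token (12B / 3B). A minimal sketch of how top-k routing produces this, assuming a generic router over expert scores (the `active_fraction` and `route` helpers are hypothetical names for illustration, not NVIDIA's implementation):

```python
import math

def active_fraction(num_experts: int, top_k: int) -> float:
    """Fraction of expert parameters a token touches under top-k routing."""
    return top_k / num_experts

def route(scores, top_k):
    """Select the top_k experts by router score and return (index, weight)
    pairs, with weights softmax-normalized over the selected experts only."""
    top = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:top_k]
    exps = [math.exp(scores[i]) for i in top]
    z = sum(exps)
    return [(i, e / z) for i, e in zip(top, exps)]

# A hypothetical 10-expert layer activating 1 expert per token touches
# one tenth of its expert parameters, mirroring the 30B/3B ratio above:
print(active_fraction(10, 1))  # 0.1
```

The key design point is that the softmax is taken over only the selected experts, so the chosen experts' outputs are combined with weights that sum to 1 while the remaining experts contribute nothing to that token's compute.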

Recent Events

Nemotron-3-Super-120B-A12B

No timeline events

Nemotron-Cascade 2

2026-03-22

Achieved Gold Medal-level performance on 2025 International Mathematical Olympiad, International Olympiad in Informatics, and ICPC World Finals

2026-03-20

Achieved 'gold medal performance' on IMO 2025 and IOI 2025 benchmarks


Related Comparisons

Nemotron-3-Super-120B-A12B Profile | Nemotron-Cascade 2 Profile | Knowledge Graph