SZ84.NET

DeepSeek-V3.2

The latest 685B-parameter flagship MoE model from DeepSeek (Dec 2025), with significantly improved inference efficiency and reasoning depth.

Key Features

Next-gen Sparse Attention

A sparse attention mechanism that restricts each query to a small set of relevant tokens, sharply reducing compute on long-context inputs.

128K context window

Handles prompts up to 128K tokens, enough for long documents, large codebases, and extended multi-turn conversations.

70% lower inference cost

The efficiency gains from sparse attention translate into roughly 70% cheaper inference on long-context workloads compared with the previous generation.
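To see why sparse attention can cut inference cost so sharply at a 128K context, here is a back-of-the-envelope sketch comparing dense attention, which scores all L² query-key pairs, against a sparse scheme that attends to a fixed budget of k tokens per query. The budget k = 2,048 is an assumption for illustration only, not DeepSeek's actual configuration or implementation.

```python
# Illustrative cost comparison: dense attention is O(L^2) score
# computations, while a fixed-budget sparse scheme is O(L * k).
CONTEXT_LEN = 128_000   # 128K context window from the listing
SPARSE_BUDGET = 2_048   # hypothetical per-query token budget (assumed)

dense_ops = CONTEXT_LEN ** 2             # every query scores every key
sparse_ops = CONTEXT_LEN * SPARSE_BUDGET # every query scores only k keys

savings = 1 - sparse_ops / dense_ops
print(f"dense:   {dense_ops:,} score computations")
print(f"sparse:  {sparse_ops:,} score computations")
print(f"savings: {savings:.1%}")
```

At full context the attention-score savings dwarf the headline 70% figure; the smaller real-world number reflects that attention is only one part of total inference cost, and that short prompts benefit far less.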

Use Cases

1. Enterprise AI infrastructure
2. Large-scale data analysis
3. Complex agentic tasks

Pricing: free
Category: open-source
Tags: Flagship, MoE, SOTA