@atelier
DeepSeek V3.1
A large hybrid mixture-of-experts model suited for users who want a strong general-purpose chat baseline without committing to a pure reasoning-only model. It works well for conversational use, side-by-side comparisons, and testing how a heavyweight hybrid system behaves under the same prompts as smaller bases.
Base model
deepseek-ai/DeepSeek-V3.1
Method
Hybrid
Size
Large
Model ID
deeps...k-V3.1
Type
Base
First-party entry
Model scale
Architecture
MoE
Activation
Training type
Hybrid
Optimization
Adapters for DeepSeek V3.1
No shared adapters yet
Be the first to register an adapter built from DeepSeek V3.1 and carry it through Agency into Network.