Baidu: ERNIE 4.5 21B A3B

A text-based Mixture-of-Experts (MoE) model with 21B total parameters, of which 3B are activated per token. It delivers strong multimodal understanding and generation through heterogeneous MoE structures and modality-isolated routing, with specialized routing and balancing losses for better task handling, and supports a 131K-token context length. Inference is kept efficient via multi-expert parallel collaboration and quantization, while post-training with SFT (supervised fine-tuning), DPO (Direct Preference Optimization), and UPO (Unified Preference Optimization) tunes performance across diverse applications.

baidu/ernie-4.5-21b-a3b

Context Size: 120K tokens

Input Price: 430.5 Ks per 1M tokens

Output Price: 1,722 Ks per 1M tokens


Architecture

Text

Supported Parameters

frequency_penalty, max_tokens, presence_penalty, repetition_penalty, seed, stop, temperature, tool_choice, tools, top_k, top_p
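As a sketch of how these parameters might be used together, the snippet below builds a chat-completion request payload. It assumes an OpenAI-compatible request shape (common among model gateways) and a hypothetical endpoint; the actual base URL, authentication, and field names should be taken from the provider's own API documentation.

```python
import json

# Hypothetical chat-completion payload exercising the supported parameters.
# The request shape is an assumption (OpenAI-compatible style), not the
# provider's documented API.
payload = {
    "model": "baidu/ernie-4.5-21b-a3b",
    "messages": [
        {"role": "user", "content": "Summarize MoE routing in one sentence."}
    ],
    "max_tokens": 512,            # must stay within the 8,000-token completion cap
    "temperature": 0.7,           # sampling randomness
    "top_p": 0.9,                 # nucleus sampling cutoff
    "top_k": 40,                  # restrict sampling to the 40 most likely tokens
    "frequency_penalty": 0.1,     # penalize frequent token repetition
    "presence_penalty": 0.0,      # penalize tokens already present at all
    "repetition_penalty": 1.05,   # multiplicative repetition control
    "seed": 42,                   # best-effort reproducibility
    "stop": ["\n\n"],             # stop generating at a blank line
}

# Serialize for an HTTP POST body (endpoint and auth omitted here).
print(json.dumps(payload, indent=2))
```

The `tools` and `tool_choice` parameters are omitted above; they would carry function-calling definitions in the same payload when tool use is needed.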

Details

Tokenizer: Other
Max Completion: 8,000 tokens
Provider Context: 120K tokens
Moderated: No