Mistral: Mistral Nemo
A 12B parameter model with a 128k token context length built by Mistral in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese, Korean, Arabic, and Hindi. It supports function calling and is released under the Apache 2.0 license.
mistralai/mistral-nemo
Context Size: 131,072 tokens
Input Price: 123 Ks/M
Output Price: 246 Ks/M
Architecture: Text
Supported Parameters
frequency_penalty, max_tokens, min_p, presence_penalty, repetition_penalty, response_format, seed, stop, structured_outputs, temperature, tool_choice, tools, top_k, top_p
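As a sketch of how these parameters are used in practice, the snippet below builds a request payload for an OpenAI-compatible chat completions endpoint (such as the one this listing describes), referencing the model by its `mistralai/mistral-nemo` identifier. The endpoint URL and exact payload shape are assumptions; only the field names come from the parameter list above.

```python
import json

# Hypothetical request payload for an OpenAI-compatible chat completions
# endpoint; parameter names follow the Supported Parameters list above.
payload = {
    "model": "mistralai/mistral-nemo",
    "messages": [
        {"role": "user", "content": "Summarize the Apache 2.0 license in one sentence."}
    ],
    "temperature": 0.7,
    "top_p": 0.9,
    "max_tokens": 256,           # must stay within the 16,384-token completion cap
    "frequency_penalty": 0.0,
    "seed": 42,                  # reproducible sampling, where the provider supports it
    "stop": ["\n\n"],
}
print(json.dumps(payload, indent=2))
```

This payload would be POSTed as JSON to the provider's chat completions route with an API key; the same structure extends to `tools` and `tool_choice` for the function-calling support mentioned above.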
Details
Tokenizer: Mistral
Instruct Type: mistral
Max Completion: 16,384 tokens
Provider Context: 131,072 tokens
Moderated: No