Meta Llama 4 Scout

Llama 4 Scout is a versatile small model served with a 1M-token context window, suitable for a wide range of applications including chatbots and question answering.

Model Specifications

Context window: 1,000,000 tokens
Max output: 16,384 tokens
Knowledge cutoff: February 2024
Multipart messages: No
Vision capabilities: No
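
The sketch below shows one way to call the model through an OpenAI-compatible chat-completions client; the base URL and model ID are hypothetical placeholders, so substitute your provider's actual values. The max_tokens value reflects the 16,384-token output limit listed above.

```python
from openai import OpenAI

# Hypothetical endpoint and model ID -- replace with your provider's values.
client = OpenAI(base_url="https://api.example.com/v1", api_key="YOUR_API_KEY")

response = client.chat.completions.create(
    model="llama-4-scout",  # hypothetical model ID
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the following document: ..."},
    ],
    max_tokens=16384,  # cap the response at the model's maximum output length
)
print(response.choices[0].message.content)
```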

Default Parameters

Temperature: 0.6
Top P: 1.0
Frequency penalty: 0.0
Presence penalty: 0.0
Top K: 40
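
A minimal sketch of passing these defaults explicitly; omitting them should give the same behavior, since they match the server-side defaults. The endpoint and model ID are assumptions, and Top K is not part of the standard chat-completions signature, so it is only noted in a comment.

```python
from openai import OpenAI

client = OpenAI(base_url="https://api.example.com/v1", api_key="YOUR_API_KEY")  # hypothetical endpoint

response = client.chat.completions.create(
    model="llama-4-scout",  # hypothetical model ID
    messages=[{"role": "user", "content": "Name three uses for a paperclip."}],
    temperature=0.6,        # default
    top_p=1.0,              # default
    frequency_penalty=0.0,  # default
    presence_penalty=0.0,   # default
    # Top K (default 40) is not a standard OpenAI parameter; if your provider
    # exposes it, it can typically be sent via extra_body={"top_k": 40}.
)
print(response.choices[0].message.content)
```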

Supported Parameters

Temperature: Supported

Controls randomness: lower values make the output more deterministic, while higher values make it more creative.

Range: 0.0 - 1.0
Default: 0.6
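
A short sketch of sweeping the temperature to see its effect: lower values give near-deterministic answers, higher values more varied ones. The client setup uses the same hypothetical endpoint and model ID as above.

```python
from openai import OpenAI

client = OpenAI(base_url="https://api.example.com/v1", api_key="YOUR_API_KEY")  # hypothetical endpoint

# Lower temperature -> more deterministic output; higher -> more varied output.
for temp in (0.0, 0.6, 1.0):
    reply = client.chat.completions.create(
        model="llama-4-scout",  # hypothetical model ID
        messages=[{"role": "user", "content": "Suggest a name for a coffee shop."}],
        temperature=temp,
    )
    print(f"temperature={temp}: {reply.choices[0].message.content}")
```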
