

Groq Llama 3.3 70B Versatile
The 70B-parameter version of Meta's Llama model delivers state-of-the-art performance, running on Groq.
Model Specifications
Default Parameters
Supported Parameters
Temperature
Supported. Controls randomness: lower values make output more deterministic, higher values more creative.
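As a quick illustration, here is a minimal sketch of setting the temperature parameter when calling this model through the Groq Python SDK. The client usage and the model ID llama-3.3-70b-versatile are assumptions based on Groq's OpenAI-compatible chat completions interface and may differ in your setup.

```python
# Minimal sketch: calling Llama 3.3 70B Versatile on Groq with an explicit temperature.
# Assumes the `groq` Python package is installed and GROQ_API_KEY is set in the environment.
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

response = client.chat.completions.create(
    model="llama-3.3-70b-versatile",  # assumed model ID for this card
    messages=[
        {"role": "user", "content": "Summarize the benefits of low-latency inference."}
    ],
    temperature=0.2,  # lower -> more deterministic; raise it for more creative output
)

print(response.choices[0].message.content)
```

In practice, a low temperature such as 0.2 suits factual or repeatable tasks, while higher values are better for brainstorming or open-ended generation.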
More models from Meta

Meta Llama 4 Scout
Llama 4 Scout is a versatile small model served with a 1M-token context window, suitable for a wide range of applications including chatbots and question answering.

Meta Llama 4 Maverick
Llama 4 Maverick is a powerful medium model with a 1M-token context window, designed for advanced reasoning tasks and multimodal understanding.