Process massive datasets with Meta-Llama-3_3-70B-Instruct, featuring an expansive 131K-token context window for long-document analysis. The model offers cost-effective pricing at $0.74 per 1M input tokens and $0.74 per 1M output tokens, native tool-calling support, and open weights. Access Meta-Llama-3_3-70B-Instruct via the OVHcloud AI Endpoints API with its full 131K-token context window.
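As a minimal sketch of calling the model through an OpenAI-style chat-completions API: the endpoint URL, environment-variable name, and exact model identifier below are assumptions — check the OVHcloud AI Endpoints console for the correct values for your project.

```python
import json
import os

# Hypothetical endpoint URL and model identifier -- verify both against
# your OVHcloud AI Endpoints console before use.
API_URL = "https://<your-endpoint>.ai.cloud.ovh.net/v1/chat/completions"
MODEL = "Meta-Llama-3_3-70B-Instruct"


def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble an OpenAI-style chat-completion payload."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


payload = build_request("Summarize this contract clause in one sentence.")
print(json.dumps(payload, indent=2))

# To actually send the request (needs an API token and the `requests` package):
# import requests
# headers = {"Authorization": f"Bearer {os.environ['OVH_AI_ENDPOINTS_TOKEN']}"}
# resp = requests.post(API_URL, json=payload, headers=headers, timeout=60)
# print(resp.json()["choices"][0]["message"]["content"])
```

Because the API follows the OpenAI chat-completions shape, the same payload works with any OpenAI-compatible client by pointing its base URL at the endpoint.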