Process massive datasets with Kimi K2, which features an expansive 131K-token context window for long-document analysis. The model delivers cost-effective pricing at $1.00/1M input and $3.00/1M output tokens, along with native tool calling support. Access Kimi K2 via the LLM Gateway API with up to 16K output tokens.
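A minimal sketch of what a request to the gateway might look like, assuming an OpenAI-compatible chat completions endpoint; the endpoint URL and the model identifier `kimi-k2` are assumptions for illustration, not confirmed by this page:

```python
import json

# Assumed endpoint URL and model identifier -- check the gateway's own
# reference for the exact values before use.
ENDPOINT = "https://api.llmgateway.io/v1/chat/completions"  # hypothetical

payload = {
    "model": "kimi-k2",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Summarize the attached long document."}
    ],
    "max_tokens": 16384,  # up to 16K output tokens, per the listing above
}

# Serialize the request body as it would be sent over HTTP.
body = json.dumps(payload)
print(body)
```

Any OpenAI-compatible HTTP client could then POST `body` to the endpoint with the gateway's API key in the `Authorization` header.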
Kimi K2 by LLM Gateway costs $1.00 per 1M input tokens and $3.00 per 1M output tokens. Cached reads cost $0.50 per 1M tokens.