grok-3-mini
xai/grok-3-mini

Compact Grok-3 for fast, cost-effective inference.

AndAI LLM Hub routes requests to the best providers able to handle your prompt size and parameters.
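As a rough illustration of how a model listed under this identifier might be called, here is a minimal sketch assuming the hub exposes an OpenAI-compatible chat-completions endpoint. The base URL `https://api.andai.example/v1` and the `ANDAI_API_KEY` environment variable are hypothetical placeholders, not confirmed by this page; only the model identifier `xai/grok-3-mini` comes from the listing above.

```python
# Minimal sketch: calling grok-3-mini through an assumed OpenAI-compatible endpoint.
# The base URL and credential name below are hypothetical placeholders.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.andai.example/v1",   # hypothetical hub endpoint
    api_key=os.environ["ANDAI_API_KEY"],       # hypothetical credential name
)

response = client.chat.completions.create(
    model="xai/grok-3-mini",  # model identifier from this listing
    messages=[{"role": "user", "content": "Summarize the trade-offs of compact models."}],
    max_tokens=256,
)

print(response.choices[0].message.content)
```

Because the hub handles provider routing, the request itself only names the model; prompt size and sampling parameters determine which underlying providers are eligible to serve it.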