Models & Inference
How do I check if a model is available on serverless?
Web UI
Go to https://app.fireworks.ai/models?filter=LLM&serverless=true to browse all LLMs currently available on serverless.
Programmatically
You can use the `is_available_on_serverless` method on the `LLM` object in our Build SDK to check if a model is available on serverless.
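A minimal sketch of the programmatic check. The `is_available_on_serverless` method name comes from the text above; the import path, the `model` name, and the constructor arguments shown here are assumptions for illustration, so check the Build SDK reference for the exact signature. Running this requires the SDK to be installed and a Fireworks API key to be configured.

```python
# Hypothetical sketch: assumes the Fireworks Build SDK is installed and an
# API key is available in the environment. The model id and constructor
# arguments are illustrative, not authoritative.
from fireworks import LLM

llm = LLM(model="llama-v3p1-8b-instruct")

# is_available_on_serverless is the method named in the docs above.
if llm.is_available_on_serverless():
    print("Model is available on serverless")
else:
    print("Model is not on serverless; a dedicated deployment may be needed")
```

This is useful as a guard before issuing inference requests, so your application can fall back to a different model or surface a clear error instead of failing at request time.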