Web UI

Go to https://app.fireworks.ai/models?filter=LLM&serverless=true to browse the models currently available on serverless (the page is pre-filtered to serverless LLMs).

Programmatically

You can use the is_available_on_serverless method on the LLM object in our Build SDK to check whether a model is available on serverless. It returns a boolean:

llm = LLM(model="llama4-maverick-instruct-basic", deployment_type="auto")
print(llm.is_available_on_serverless()) # True

llm = LLM(model="qwen2p5-7b-instruct", deployment_type="auto")
print(llm.is_available_on_serverless()) # False
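
If you want to act on the result, here is a minimal sketch that falls back to a dedicated deployment when a model is not on serverless. The "on-demand" deployment_type value and the get_llm helper are assumptions for illustration, not part of the examples above:

from fireworks import LLM

def get_llm(model: str) -> LLM:
    # Probe availability with deployment_type="auto", as in the examples above
    llm = LLM(model=model, deployment_type="auto")
    if llm.is_available_on_serverless():
        return llm
    # Assumed fallback: "on-demand" requests a dedicated deployment instead
    return LLM(model=model, deployment_type="on-demand")

llm = get_llm("qwen2p5-7b-instruct")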