# Platforms

Platforms let you connect to LLMs from different providers through a single API. Railtracks supports connecting to the following major LLM platforms:
- Azure AI Foundry
- Ollama
- HuggingFace
- Portkey
The code is the same as for LLM Providers, with the provider name replaced by the platform name.
## Quick Start Examples

```python
import railtracks as rt

# make sure to configure your environment variables for Ollama
model = rt.llm.OllamaLLM("deepseek-r1:8b")
```
## Tool Calling Support

For HuggingFace serverless inference models, make sure that the model you are using supports tool calling. We DO NOT check for tool calling support in HuggingFace models. If your model does not support tool calling, it will fall back to regular chat, even if the `tool_nodes` parameter is provided.
In the case of HuggingFace, `model_name` must be of the format:

```
huggingface/<provider>/<hf_org_or_user>/<hf_model>
```
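As a sketch, the four-part format above can be checked with a small helper. Note that this helper and the model identifiers in it are illustrative assumptions, not part of the Railtracks API:

```python
import re

# Hypothetical helper (not part of Railtracks) that checks a model_name
# string against the required HuggingFace format:
#   huggingface/<provider>/<hf_org_or_user>/<hf_model>
HF_MODEL_NAME = re.compile(r"^huggingface/[^/]+/[^/]+/[^/]+$")

def is_valid_hf_model_name(model_name: str) -> bool:
    """Return True if model_name has the expected four-part format."""
    return HF_MODEL_NAME.match(model_name) is not None

# A well-formed identifier (provider and model here are illustrative):
print(is_valid_hf_model_name("huggingface/together/meta-llama/Llama-3.3-70B-Instruct"))  # True
# Missing the leading "huggingface/" prefix:
print(is_valid_hf_model_name("together/meta-llama/Llama-3.3-70B-Instruct"))  # False
```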
Here are a few example models that you can use: