A simple guide to streamlining AI deployment with Novita AI + Hugging Face.
Novita AI’s integration with the Hugging Face platform enables advanced serverless inference. It provides direct access to models from their Hub model pages through optimized infrastructure, giving developers a streamlined setup experience. With full support for Hugging Face’s JavaScript and Python SDKs, Novita AI simplifies model deployment and scaling without infrastructure management.

This comprehensive guide walks you through using Novita AI on Hugging Face, covering both the web interface and SDK integration methods.
Step 3: Explore Compatible Providers on Model Pages
Model pages display the third-party inference providers that are compatible with the selected model, sorted by user preference.
```python
from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="novita",
    # Optional on the first call, required afterwards;
    # get a key at https://novita.ai/settings/key-management
    api_key="xxxxxxxxxxxxxxxxxxxxxxxx",
)

# An example question
messages = [
    {
        "role": "user",
        "content": "Sally (a girl) has 3 brothers. Each brother has 2 sisters. How many sisters does Sally have?",
    },
]

completion = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1",
    messages=messages,
    max_tokens=512,
)

print(completion.choices[0].message)
```
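For longer responses, the same client can stream tokens as they are generated by passing `stream=True`. The sketch below is illustrative, not part of the official guide: the environment-variable name `NOVITA_API_KEY` and the helper `build_messages` are assumptions introduced here, and the SDK import is done lazily so the sketch can be read without the package installed. The network call only runs when a key is actually configured.

```python
import os


def build_messages(question: str) -> list:
    # Chat-completion messages follow the role/content schema used above.
    return [{"role": "user", "content": question}]


# NOVITA_API_KEY is an illustrative variable name; store your key however you prefer.
api_key = os.environ.get("NOVITA_API_KEY")

if api_key:
    # Imported lazily so this sketch can be inspected without the SDK installed.
    from huggingface_hub import InferenceClient

    client = InferenceClient(provider="novita", api_key=api_key)
    stream = client.chat.completions.create(
        model="deepseek-ai/DeepSeek-R1",
        messages=build_messages("Briefly explain serverless inference."),
        max_tokens=256,
        stream=True,  # yield partial deltas instead of one final message
    )
    for chunk in stream:
        # Each chunk carries an incremental piece of the assistant's reply.
        print(chunk.choices[0].delta.content or "", end="")
```

Streaming is useful in chat UIs, where showing partial output immediately feels much faster than waiting for the full completion.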