Meta Llama 3.3 hosted on GCP Vertex AI in us-west1.
Provider: Meta | GCP Vertex AI
Inference regions: us-west1
Endpoint: `https://us-west1-aiplatform.googleapis.com/v1/projects/{project}/locations/{region}/publishers/{publisher}/models/{model}:generateContent`

Install: `pip install google-cloud-aiplatform`

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Initialize the SDK with your project ID and the us-west1 region
vertexai.init(project="your-project-id", location="us-west1")

# Load the model and send a prompt
model = GenerativeModel("llama3-3")
response = model.generate_content("Hello, how are you?")
print(response.text)
```

| Parameter | Type | Description |
|---|---|---|
| maxOutputTokens | integer | Maximum number of tokens to generate. (≥1) |
| temperature | float | Controls randomness. (0–2) Default: 1. |
| topP | float | Nucleus sampling threshold. (0–1) Default: 1. |
| topK | integer | Top-K sampling. (≥0) |
| stopSequences | array of strings | Sequences at which generation stops. |
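The parameters above are sent in the `generationConfig` object of the request body. As a minimal sketch (assuming the REST field names match the table; the default values chosen here are illustrative, not recommendations), a request body for the `generateContent` endpoint could be built like this:

```python
import json

def build_request(prompt, max_output_tokens=256, temperature=0.7,
                  top_p=0.9, top_k=40, stop_sequences=None):
    """Build a generateContent request body using the table's parameters.

    Field names use camelCase, matching the REST API surface shown above.
    """
    return {
        "contents": [{"role": "user", "parts": [{"text": prompt}]}],
        "generationConfig": {
            "maxOutputTokens": max_output_tokens,  # ≥1
            "temperature": temperature,            # 0–2, default 1
            "topP": top_p,                         # 0–1, default 1
            "topK": top_k,                         # ≥0
            "stopSequences": stop_sequences or [],
        },
    }

body = build_request("Hello, how are you?", stop_sequences=["\n\n"])
print(json.dumps(body, indent=2))
```

The same settings can be passed through the SDK via `GenerationConfig` as the `generation_config` argument to `generate_content`.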