Load CLIP
advanced/loaders
CLIPLoader recipes (which text encoder(s) to load for each model family):

- stable_diffusion: clip-l
- stable_cascade: clip-g
- sd3: t5 xxl / clip-g / clip-l
- stable_audio: t5 base
- mochi: t5 xxl
- cosmos: old t5 xxl
- lumina2: gemma 2 2B
- wan: umt5 xxl
- hidream: llama-3.1 (recommended) or t5
- omnigen2: qwen vl 2.5 3B
Example
JSON Example
```json
{
  "class_type": "CLIPLoader",
  "inputs": {
    "clip_name": "https://huggingface.co/openai/clip-vit-large-patch14/resolve/main/model.safetensors",
    "type": "stable_diffusion"
  }
}
```

This example shows required inputs only. Connection values like `["node_id", 0]` should reference actual node IDs from your workflow.
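As a sketch, a node entry like the one above can be built programmatically before it is added to a workflow dict. The helper name and the example filename are illustrative, not part of ComfyUI:

```python
import json

def make_clip_loader(clip_name, clip_type, device=None):
    """Build a CLIPLoader node entry for a ComfyUI workflow dict.

    clip_name and clip_type are required; device is optional and
    omitted from the entry when not given.
    """
    inputs = {"clip_name": clip_name, "type": clip_type}
    if device is not None:
        inputs["device"] = device
    return {"class_type": "CLIPLoader", "inputs": inputs}

# Hypothetical filename; use a file that actually exists in your
# local text-encoder models folder.
node = make_clip_loader("clip_l.safetensors", "stable_diffusion")
print(json.dumps(node, indent=2))
```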
Inputs
| Name | Type | Status | Constraints | Default |
|---|---|---|---|---|
| clip_name | ENUM (0 options) | required | - | - |
| type | ENUM (18 options) | required | - | - |
| device | ENUM (2 options) | optional | - | - |
Outputs
| Index | Name | Type | Is List | Connection Reference |
|---|---|---|---|---|
0 | CLIP | CLIP | No | ["{node_id}", 0] |
How to connect to these outputs
To connect another node's input to an output from this node, use the connection reference format:
`["node_id", output_index]`

Where `node_id` is the ID of this CLIPLoader node in your workflow, and `output_index` is the index from the table above.
Example
If this node has ID "5" in your workflow:
CLIP (CLIP): `["5", 0]`
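For instance, a minimal workflow fragment might wire this node's CLIP output into the `clip` input of a CLIPTextEncode node. The node IDs, the filename, and the prompt text below are illustrative:

```python
import json

# Node "5" loads the text encoder; node "6" consumes its CLIP output
# via the connection reference ["5", 0] (node ID "5", output index 0).
workflow = {
    "5": {
        "class_type": "CLIPLoader",
        "inputs": {
            "clip_name": "clip_l.safetensors",  # illustrative filename
            "type": "stable_diffusion",
        },
    },
    "6": {
        "class_type": "CLIPTextEncode",
        "inputs": {
            "text": "a photo of a cat",
            "clip": ["5", 0],  # CLIP output of node "5"
        },
    },
}
print(json.dumps(workflow, indent=2))
```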