
Load CLIP

Node class: CLIPLoader
Category: advanced/loaders

[Recipes] Recommended text encoder per model type:
• stable_diffusion: clip-l
• stable_cascade: clip-g
• sd3: t5 xxl / clip-g / clip-l
• stable_audio: t5 base
• mochi: t5 xxl
• cosmos: old t5 xxl
• lumina2: gemma 2 2B
• wan: umt5 xxl
• hidream: llama-3.1 (recommended) or t5
• omnigen2: qwen vl 2.5 3B

Example

JSON Example
{
  "class_type": "CLIPLoader",
  "inputs": {
    "clip_name": "https://huggingface.co/openai/clip-vit-large-patch14/resolve/main/model.safetensors",
    "type": "stable_diffusion"
  }
}

This example shows required inputs only. Connection values like ["node_id", 0] should reference actual node IDs from your workflow.
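
The optional device input (see Inputs below) can be added to keep the text encoder on a specific device. A minimal sketch using the same required inputs as above plus device set to cpu, one of the two values listed in the Inputs table:

{
  "class_type": "CLIPLoader",
  "inputs": {
    "clip_name": "https://huggingface.co/openai/clip-vit-large-patch14/resolve/main/model.safetensors",
    "type": "stable_diffusion",
    "device": "cpu"
  }
}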

Inputs

| Name      | Type | Status   | Constraints                | Default |
| --------- | ---- | -------- | -------------------------- | ------- |
| clip_name | ENUM | required | 0 options (URL: CLIP)      | -       |
| type      | ENUM | required | 18 options (listed below)  | -       |
| device    | ENUM | optional | 2 options: default, cpu    | -       |

type options:
• stable_diffusion
• stable_cascade
• sd3
• stable_audio
• mochi
• ltxv
• pixart
• cosmos
• lumina2
• wan
• hidream
• chroma
• ace
• omnigen2
• qwen_image
• hunyuan_image
• flux2
• ovis
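
Pairing the type list with the recipes above: for a Wan workflow the loader should point at a umt5 xxl text encoder. A minimal sketch, assuming a hypothetical file name wan_umt5_xxl.safetensors (substitute whichever clip_name value your installation actually exposes):

{
  "class_type": "CLIPLoader",
  "inputs": {
    "clip_name": "wan_umt5_xxl.safetensors",
    "type": "wan"
  }
}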

Outputs

| Index | Name | Type | Is List | Connection Reference |
| ----- | ---- | ---- | ------- | -------------------- |
| 0     | CLIP | CLIP | No      | ["{node_id}", 0]     |

How to connect to these outputs

To connect another node's input to an output from this node, use the connection reference format:

["node_id", output_index]

Where node_id is the ID of this CLIPLoader node in your workflow, and output_index is the index from the table above.

Example

If this node has ID "5" in your workflow:

• CLIP (CLIP): ["5", 0]
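
As a sketch of how a downstream node consumes this output, assuming this CLIPLoader has ID "5" and is wired into ComfyUI's standard CLIPTextEncode node (the prompt text is illustrative):

{
  "5": {
    "class_type": "CLIPLoader",
    "inputs": {
      "clip_name": "https://huggingface.co/openai/clip-vit-large-patch14/resolve/main/model.safetensors",
      "type": "stable_diffusion"
    }
  },
  "6": {
    "class_type": "CLIPTextEncode",
    "inputs": {
      "text": "a photo of a cat",
      "clip": ["5", 0]
    }
  }
}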