
🔧 Apply CLIPSeg

Category: essentials/segmentation
Class type: ApplyCLIPSeg+

JSON Example
{
  "class_type": "ApplyCLIPSeg+",
  "inputs": {
    "clip_seg": [
      "node_id",
      0
    ],
    "image": [
      "node_id",
      0
    ],
    "prompt": "",
    "threshold": 0.4,
    "smooth": 9,
    "dilate": 0,
    "blur": 0
  }
}

This example shows required inputs only. Connection values such as ["node_id", 0] are placeholders; replace them with the actual node IDs from your workflow.
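
For reference, here is a hedged sketch of how ApplyCLIPSeg+ might be wired in an API-format workflow. The upstream class names LoadCLIPSegModels+ (the CLIPSeg loader from the same node pack) and LoadImage (a core ComfyUI node) are assumptions used for illustration; substitute whichever nodes actually supply your CLIP_SEG and IMAGE values.

{
  "1": {
    "class_type": "LoadCLIPSegModels+",
    "inputs": {}
  },
  "2": {
    "class_type": "LoadImage",
    "inputs": {
      "image": "example.png"
    }
  },
  "3": {
    "class_type": "ApplyCLIPSeg+",
    "inputs": {
      "clip_seg": ["1", 0],
      "image": ["2", 0],
      "prompt": "a dog",
      "threshold": 0.4,
      "smooth": 9,
      "dilate": 0,
      "blur": 0
    }
  }
}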

Inputs

Name        Type      Status    Constraints                   Default
clip_seg    CLIP_SEG  required  -                             -
image       IMAGE     required  -                             -
prompt      STRING    required  -                             ""
threshold   FLOAT     required  min: 0, max: 1, step: 0.05    0.4
smooth      INT       required  min: 0, max: 32, step: 1      9
dilate      INT       required  min: -32, max: 32, step: 1    0
blur        INT       required  min: 0, max: 64, step: 1      0
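
Broadly, threshold binarizes the CLIPSeg probability map, smooth and blur soften the resulting mask, and dilate grows (positive values) or shrinks (negative values) it. That reading is an assumption based on common mask-processing conventions rather than something stated on this page, but it suggests sketches like the following for a tighter mask with feathered edges:

{
  "class_type": "ApplyCLIPSeg+",
  "inputs": {
    "clip_seg": ["node_id", 0],
    "image": ["node_id", 0],
    "prompt": "the red car",
    "threshold": 0.6,
    "smooth": 5,
    "dilate": 4,
    "blur": 8
  }
}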

Outputs

Index  Name  Type  Is List  Connection Reference
0      MASK  MASK  No       ["{node_id}", 0]
How to connect to these outputs

To connect another node's input to an output from this node, use the connection reference format:

["node_id", output_index]

Where node_id is the ID of this ApplyCLIPSeg+ node in your workflow, and output_index is the index from the table above.

Example

If this node has ID "5" in your workflow:

  • MASK (MASK): ["5", 0]
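
A downstream node then consumes the mask by using that reference in its own inputs. As a hedged sketch, the snippet below feeds the mask into MaskToImage (a core ComfyUI node) so it can be previewed as an image; the node ID "6" is arbitrary.

{
  "6": {
    "class_type": "MaskToImage",
    "inputs": {
      "mask": ["5", 0]
    }
  }
}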