NABLA Attention KJ

KJNodes/experimental
NABLA_AttentionKJ

Experimental node that patches a video model's attention mode to use NABLA sparse attention. Currently only works with Kandinsky5.

Example

JSON Example
{
  "class_type": "NABLA_AttentionKJ",
  "inputs": {
    "model": [
      "node_id",
      0
    ],
    "latent": [
      "node_id",
      0
    ],
    "window_time": 11,
    "window_width": 3,
    "window_height": 3,
    "sparsity": 0.9,
    "torch_compile": true
  }
}

This example shows required inputs only. Connection values like ["node_id", 0] should reference actual node IDs from your workflow.
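The JSON above can be built programmatically before submitting a workflow to ComfyUI. The sketch below is a minimal helper, not part of the node itself; the node IDs `"1"` and `"2"` are placeholders for whatever upstream model-loader and latent nodes exist in your workflow, and the defaults mirror the table in the Inputs section.

```python
import json

def nabla_attention_node(model_ref, latent_ref,
                         window_time=11, window_width=3, window_height=3,
                         sparsity=0.9, torch_compile=True):
    """Build the workflow entry for a NABLA_AttentionKJ node.

    model_ref / latent_ref are connection references of the form
    ["node_id", output_index], pointing at upstream nodes.
    """
    return {
        "class_type": "NABLA_AttentionKJ",
        "inputs": {
            "model": model_ref,
            "latent": latent_ref,
            "window_time": window_time,
            "window_width": window_width,
            "window_height": window_height,
            "sparsity": sparsity,
            "torch_compile": torch_compile,
        },
    }

# "1" and "2" are hypothetical IDs of a model loader and a latent node.
node = nabla_attention_node(["1", 0], ["2", 0])
print(json.dumps(node, indent=2))
```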

Inputs

| Name | Type | Status | Constraints | Default |
|------|------|--------|-------------|---------|
| model | MODEL | required | - | - |
| latent? | LATENT | required | - | - |
| window_time? | INT | required | min: 1 | 11 |
| window_width? | INT | required | min: 1 | 3 |
| window_height? | INT | required | min: 1 | 3 |
| sparsity | FLOAT | required | min: 0, max: 1, step: 0.01 | 0.9 |
| torch_compile? | BOOLEAN | required | - | true |

Outputs

| Index | Name | Type | Is List | Connection Reference |
|-------|------|------|---------|----------------------|
| 0 | MODEL | MODEL | No | ["{node_id}", 0] |
How to connect to these outputs

To connect another node's input to an output of this node, use the connection reference format:

["node_id", output_index]

Where node_id is the ID of this NABLA_AttentionKJ node in your workflow, and output_index is the index from the table above.

Example

If this node has ID "5" in your workflow:

  • MODEL (MODEL): ["5", 0]
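As a minimal sketch, a downstream node consumes the patched MODEL output (index 0) via that connection reference. The `KSampler` class type and the node IDs here are illustrative only; substitute the IDs from your own workflow.

```python
# Downstream node entry: its "model" input references output 0 of the
# NABLA_AttentionKJ node, assumed here to have ID "5" in the workflow.
downstream = {
    "class_type": "KSampler",
    "inputs": {
        "model": ["5", 0],  # [node_id of NABLA_AttentionKJ, output index]
    },
}
print(downstream["inputs"]["model"])  # -> ['5', 0]
```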