
[FEATURE] WaveNet: Model head, layer array head variable kernel size#249

Open
sdatkinson wants to merge 4 commits into main from wavenet-head

Conversation

@sdatkinson
Owner

  • Support heads on WaveNet models
  • WaveNet layer arrays' rechannel heads (`rechannel_heads`) are no longer restricted to Conv1x1s; they can now have kernel_size > 1 (general Conv1D).
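As a rough sketch of what "general Conv1D instead of Conv1x1" means for the rechannel head, the following is a minimal causal 1-D convolution (names and shapes are illustrative, not the actual NAM classes); with `kernelSize == 1` it reduces to the old per-frame 1x1 behavior:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical sketch of a rechannel head as a general causal Conv1D.
// Shapes: input [in_channels][num_frames],
//         weight [out_channels][in_channels][kernel_size].
// With kernel_size == 1 this is exactly a Conv1x1.
std::vector<std::vector<float>> RechannelConv1D(
    const std::vector<std::vector<float>>& input,
    const std::vector<std::vector<std::vector<float>>>& weight,
    const std::vector<float>& bias)
{
  const size_t outChannels = weight.size();
  const size_t inChannels = weight[0].size();
  const size_t kernelSize = weight[0][0].size();
  const size_t numFrames = input[0].size();
  std::vector<std::vector<float>> output(outChannels, std::vector<float>(numFrames, 0.0f));
  for (size_t o = 0; o < outChannels; o++)
    for (size_t t = 0; t < numFrames; t++)
    {
      float acc = bias[o];
      for (size_t i = 0; i < inChannels; i++)
        for (size_t k = 0; k < kernelSize; k++)
        {
          // Causal: frame t looks back (kernel_size - 1) frames; out-of-range
          // taps are treated as zero padding.
          const long tIn = (long)t - (long)(kernelSize - 1) + (long)k;
          if (tIn >= 0)
            acc += weight[o][i][k] * input[i][(size_t)tIn];
        }
      output[o][t] = acc;
    }
  return output;
}
```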

Implement PostStackHead matching the Python Head export: the weight order is layer arrays, then post-stack conv weights, then head_scale. Parse the head JSON in parse_config_json; output channels come from head.out_channels when present. Reject a post-stack head on SlimmableWavenet. Add unit tests for the receptive field and a smoke test of process.

Made-with: Cursor
Match neural-amp-modeler: the head export omits in_channels, so derive it from the last
layer's head_size. An optional legacy in_channels is validated when present.
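The derive-and-validate rule can be illustrated with a small helper (the function name and error message are assumptions, not the actual code):

```cpp
#include <optional>
#include <stdexcept>

// Hypothetical sketch: the head's input channel count is taken from the last
// layer array's head_size because the exporter omits "in_channels". A legacy
// "in_channels" field, when present, must agree with the derived value.
int DeriveHeadInChannels(int lastLayerHeadSize, std::optional<int> legacyInChannels)
{
  if (legacyInChannels.has_value() && legacyInChannels.value() != lastLayerHeadSize)
    throw std::runtime_error("Legacy in_channels does not match last layer head_size");
  return lastLayerHeadSize;
}
```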

Made-with: Cursor
Apply each head layer's activation to that layer's actual input buffer, and add a regression test for two-layer heads so multi-layer kernel stacks remain numerically correct.
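The fixed ordering (activation applied to each layer's own input, then that layer's conv) can be sketched in miniature; here each "layer" is reduced to a ReLU followed by a scalar gain purely to show the per-layer ordering, which is not the real head math:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Hypothetical sketch: a multi-layer head where each layer applies its
// activation to *that layer's actual input buffer* (the previous layer's
// output), then applies its own transform. Each layer is simplified to a
// ReLU activation plus a scalar gain.
std::vector<float> HeadProcess(std::vector<float> x, const std::vector<float>& layerGains)
{
  for (float gain : layerGains)
  {
    // Activation on this layer's actual input...
    for (float& v : x)
      v = std::max(v, 0.0f);
    // ...then this layer's (toy) conv, a scalar gain.
    for (float& v : x)
      v *= gain;
  }
  return x;
}
```

With two layers, the second ReLU acts on the first layer's output rather than on a stale copy of the head input, which is the distinction the regression test pins down.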

Made-with: Cursor
- LayerArrayParams gains head_kernel_size; the head rechannel uses Conv1D; the receptive field includes the head kernel.
- parse_config_json: accepts a nested head object or the legacy head_size/head_bias fields; slimming rejects kernel_size != 1.
- Tests: updated LayerArrayParams call sites; test_layer_head_config covers the legacy vs. new schema.
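The receptive-field accounting can be sketched as follows (a toy formula under assumed names: each dilated layer contributes `(kernel_size - 1) * dilation`, and the head rechannel now adds `(head_kernel_size - 1)` on top; with a 1x1 head it adds nothing, matching the old behavior):

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Hypothetical sketch of receptive-field bookkeeping. Each layer is a
// {kernel_size, dilation} pair; the head rechannel conv contributes
// (head_kernel_size - 1) extra frames of lookback.
long ReceptiveField(const std::vector<std::pair<long, long>>& layers, long headKernelSize)
{
  long rf = 1; // the current frame itself
  for (const auto& [kernel, dilation] : layers)
    rf += (kernel - 1) * dilation;
  rf += headKernelSize - 1;
  return rf;
}
```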

Made-with: Cursor
