
Support biased SwiGLU in MXFP4 MoE #3735

Open

XiaobingSuper wants to merge 1 commit into ROCm:develop from XiaobingSuper:xiaobing/swiglu
Conversation

@XiaobingSuper

Add bias-aware SwiGLU handling to the MXFP4 MoE gridwise paths and keep split-k epilogues from accumulating stale data when k_batch is one.
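The split-k epilogue fix described above can be sketched as follows. This is an illustrative model, not the actual CK-Tile kernel code; the function name `epilogue` and its signature are hypothetical:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Sketch of the split-k epilogue behavior (hypothetical API, not CK-Tile's).
// With split-k, each k-batch's partial result is combined into the output
// buffer. When k_batch == 1 (or for the first k-batch), the epilogue must
// plain-store rather than read-modify-write; otherwise stale values left in
// the uninitialized output buffer are accumulated into the final result.
void epilogue(std::vector<float>& out, const std::vector<float>& partial,
              int k_batch, bool first_k_batch)
{
    for (std::size_t i = 0; i < out.size(); ++i)
    {
        if (k_batch == 1 || first_k_batch)
            out[i] = partial[i];  // store: stale contents are ignored
        else
            out[i] += partial[i]; // accumulate subsequent k-batches
    }
}
```

With the buggy behavior (always accumulating), an output buffer pre-filled with garbage would contaminate the result even when k_batch is one; the store path avoids that.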

Proposed changes

  • Add bias support to MXFP4 MoE BNS/BPreShuffle device and gridwise paths.
  • Add swiglustep_and_mul handling for MXFP4 MoE with GPT-OSS style SwiGLU clamp/alpha behavior.
  • Fix CK-Tile MoE GEMM1 split-k epilogue behavior so k_batch == 1 stores instead of accumulating stale output.
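The biased, GPT-OSS style SwiGLU referenced in the bullets above can be sketched scalar-wise as follows. This is a minimal sketch, not the CK gridwise implementation; the function name `swiglu_gptoss` and the default `alpha`/`limit` constants are illustrative assumptions:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Illustrative scalar model of GPT-OSS style SwiGLU with bias, clamp and
// alpha (hypothetical helper, not the CK kernel API). The gate branch is
// clamped from above, the linear branch on both sides, and the silu uses
// sigmoid(alpha * g).
float swiglu_gptoss(float gate, float up, float gate_bias, float up_bias,
                    float alpha = 1.702f, float limit = 7.0f)
{
    float g = gate + gate_bias;       // bias applied before the activation
    float u = up + up_bias;
    g = std::min(g, limit);           // clamp gate from above only
    u = std::clamp(u, -limit, limit); // clamp linear branch on both sides
    const float silu = g / (1.0f + std::exp(-alpha * g)); // g * sigmoid(alpha*g)
    return silu * (u + 1.0f);         // "step and mul" with the +1 offset
}
```

In the fused kernel this logic would run in the GEMM1 epilogue per output element, after the bias has been loaded alongside the MXFP4 weights.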

Checklist

Please put an x into the boxes that apply. You can also fill these out after creating the PR. If you're not sure, please don't hesitate to ask.

  • I have added tests relevant to the introduced functionality, and the unit tests are passing locally
  • I have added the test to the REGRESSION_TESTS list defined at the top of tests/CMakeLists.txt, IF the test takes more than 30 seconds to run.
  • I have added inline documentation which enables the maintainers to understand the motivation
  • I have removed the stale documentation which is no longer relevant after this pull request
  • (If this change is user-facing) I have added release notes which provide the end users with a brief summary of the improvement from this pull request
  • I have run clang-format on all changed files
  • Any dependent changes have been merged

Discussion

If this is a relatively large or complex change, feel free to start a discussion by explaining why you chose the solution you did and what alternatives you considered.
