Add missing kpack triton compile options #2935
Draft
leonling-ll wants to merge 1 commit into ROCm:develop from
Conversation
kpack triton compile options
Pull request overview
This PR fixes a missing parameter issue in PyTorch's Triton compiler integration. The kpack parameter, which controls packing behavior for Triton GEMM kernels, was not being passed through the compilation pipeline, causing kernels that require kpack > 1 to fall back to the default value of 1 and miss expected performance optimizations.
Changes:
- Added kpack parameter handling in the _create_compile_options method to extract the value from compile metadata and pass it to the Triton compiler options
- Added kpack as an optional parameter to the triton_config function and ensured it is properly propagated to the config's kwargs
Author
Same as pytorch#173179
Jenkins build for 8bf933dcb1257b73d0e1cf7922e8e8861db331cf commit finished as FAILURE
The kpack option is not passed through to the Triton compiler in Torch's CachingAutotuner, so the Triton compiler falls back to its default value of 1. Triton GEMM kernels that prefer kpack > 1 therefore cannot reach the expected performance. This change passes kpack through again.