[Feature] support rl_tp_degree (#3934)

* [Feature] support rl_tp_degree

* add rl_tp_degree in lmhead

* add rl_tp_degree in bias

* fix split_axis=0 in bias

* fix split_axis in weight

* fix bias rl_tp_degree

* change attr to dict

---------

Co-authored-by: Jiang-Jia-Jun <163579578+Jiang-Jia-Jun@users.noreply.github.com>
lizhenyun01
2025-09-08 16:20:32 +08:00
committed by GitHub
parent fa2369271d
commit d40a1046de
3 changed files with 31 additions and 1 deletion

```diff
@@ -77,6 +77,11 @@ class VocabParallelEmbedding(nn.Layer):
             )
             if self.world_size > 1:
                 set_weight_attrs(self.embeddings.weight, {"output_dim": False})
+                set_weight_attrs(
+                    self.embeddings.weight,
+                    {"rl_need_attr": {"rl_tp_degree": fd_config.parallel_config.tensor_parallel_size}},
+                )
         else:
             # column cut embedding
             self.embeddings = nn.Embedding(
```
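
The added lines tag the embedding weight with a `rl_need_attr` dict so that downstream code (e.g. an RL weight-sync step) can later recover the tensor-parallel degree the weight was sharded with. Below is a minimal, self-contained sketch of that pattern; `FakeWeight`, the simplified `set_weight_attrs`, and the hard-coded `tensor_parallel_size = 4` are illustrative stand-ins, not FastDeploy's actual implementation.

```python
# Minimal sketch of the attribute-tagging pattern above. FakeWeight and this
# simplified set_weight_attrs are stand-ins, not FastDeploy's real classes.


class FakeWeight:
    """Stand-in for a paddle parameter that can carry extra attributes."""


def set_weight_attrs(weight, attrs):
    # Simplified helper: stash each entry on the parameter so downstream
    # code can inspect sharding metadata later.
    for key, value in attrs.items():
        setattr(weight, key, value)


weight = FakeWeight()
tensor_parallel_size = 4  # stands in for fd_config.parallel_config.tensor_parallel_size

# Mirrors the added diff lines: store the TP degree in a dict under
# "rl_need_attr" (the "change attr to dict" step in the commit history).
set_weight_attrs(weight, {"rl_need_attr": {"rl_tp_degree": tensor_parallel_size}})

# A consumer, e.g. an RL weight-sync step, can read the degree back:
print(getattr(weight, "rl_need_attr", {}).get("rl_tp_degree"))  # -> 4
```

Keeping the metadata in a single dict, rather than one attribute per field, lets further RL-related entries be added later without touching every call site.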