【FIX】Change the name of sparse attn from moba to plas (#4006)

* Update docs

* 【docs】 update readme (#4000)

* Update docs

* update readme

* update docs

* 【FIX】Change the name of sparse attn from moba to plas (#3845)

* Update docs

* Update docs

* Update docs

* Update docs

* Change moba to plas

* code style

* update ci

* code style

* update ci

* code style

---------

Co-authored-by: Jiang-Jia-Jun <163579578+Jiang-Jia-Jun@users.noreply.github.com>
Author: yangjianfengo1
Date: 2025-09-10 10:04:29 +08:00
Committed by: GitHub
Parent: 35b8362804
Commit: dfc94371ee
14 changed files with 151 additions and 151 deletions


@@ -64,9 +64,9 @@ class CUDAPlatform(Platform):
         elif selected_backend == _Backend.FLASH_ATTN:
             logger.info("Using FLASH ATTN backend.")
             return "fastdeploy.model_executor.layers.attention.FlashAttentionBackend"
-        elif selected_backend == _Backend.MOBA_ATTN:
-            logger.info("Using MOBA ATTN backend.")
-            return "fastdeploy.model_executor.layers.attention.MobaAttentionBackend"
+        elif selected_backend == _Backend.PLAS_ATTN:
+            logger.info("Using PLAS ATTN backend.")
+            return "fastdeploy.model_executor.layers.attention.PlasAttentionBackend"
         else:
             raise ValueError(
                 "Invalid attention backend you specified.\n"