董林卓

@dlinnn

No bio yet.

Watched repositories (2)

    omniai/omni-npu

    A vLLM (0.12.0) out-of-tree platform plugin that enables running vLLM on NPU (Ascend/torch_npu).

    Last updated: 10 days ago

    董林卓/omni_infer (forked from omniai/omni_infer)

    Omni_Infer is a suite of inference accelerators designed for the Ascend NPU platform, offering native support and an expanding feature set.

    Last updated: 1 month ago
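For context on what an "out-of-tree platform plugin" like omni-npu means: vLLM discovers hardware-platform plugins through the `vllm.platform_plugins` entry-point group, where each plugin exposes a function that returns the import path of its platform class (or `None` to stay inactive). The sketch below illustrates the general shape of such a registration hook; the package and class names (`my_npu_plugin`, `MyNPUPlatform`) are hypothetical and not taken from the omni-npu source.

```python
# Minimal sketch of a vLLM out-of-tree platform plugin registration hook.
# The plugin package would advertise it in pyproject.toml, e.g.:
#
#   [project.entry-points."vllm.platform_plugins"]
#   my_npu = "my_npu_plugin:register"
#
# Names here are illustrative assumptions, not the omni-npu implementation.

def register() -> "str | None":
    """Return the dotted path of a Platform subclass, or None to opt out.

    Returning None tells vLLM this platform is unavailable, so the plugin
    only activates when the Ascend stack (torch_npu) is importable.
    """
    try:
        import torch_npu  # noqa: F401  # only present on NPU machines
    except ImportError:
        return None
    return "my_npu_plugin.platform.MyNPUPlatform"
```

Guarding on the `torch_npu` import lets the same Python environment host multiple platform plugins: each one opts out cleanly on hardware it does not support.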
