Chinese LLaMA & Alpaca LLMs + local CPU/GPU training and deployment
Python · 18.9k stars · 1.9k forks
Chinese LLaMA-2 & Alpaca-2 LLMs (phase 2 project) + 64K long-context models
Python · 7.1k stars · 567 forks
Chinese Llama-3 LLMs (phase 3 project), developed from Meta Llama 3
Python · 2k stars · 170 forks
Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm model series)
Python · 10.2k stars · 1.4k forks
Chinese Mixtral Mixture-of-Experts LLMs (Chinese Mixtral MoE LLMs)
Python · 610 stars · 43 forks
Revisiting Pre-trained Models for Chinese Natural Language Processing (MacBERT)
709 stars · 61 forks